Hardy Beta - smbfs/cifs issues
This bug report was converted into a question: question #28523: Hardy Beta - smbfs/cifs issues.
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
samba (Ubuntu) | Invalid | Undecided | Unassigned |
Bug Description
"smbfs" is not installed by default, which it probably should be if Ubuntu is to be used in the enterprise. Or is there an alternative? I have been using "smbfs" for a few releases now and am not sure whether the way SMB shares are mounted has changed for Hardy.
I installed smbfs and then added entries to /etc/fstab so the shares automount on startup. An example entry:
//<ip address of nas box>/<share name> /home/hamish/
(The username and password are in the .smbcredentials file)
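For reference, a complete cifs fstab entry of the kind described above might look like the following. This is an illustrative sketch only: the server address, share name, mount point, and credentials path are placeholders, not the actual values from this report.

```
# /etc/fstab — example cifs entry (all names are placeholders)
//192.168.1.10/media  /home/hamish/media  cifs  credentials=/home/hamish/.smbcredentials,iocharset=utf8  0  0
```

The referenced credentials file keeps the username and password out of the world-readable fstab; it typically contains two lines:

```
username=myuser
password=mypass
```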
On startup, for each entry in the /etc/fstab file, I get the following in dmesg:
[ 70.495504] CIFS VFS: Error connecting to IPv4 socket. Aborting operation
[ 70.495569] CIFS VFS: cifs_mount failed w/return code = -101
But the shares are mounted anyway, and a Nautilus window opens for each one (which is also annoying...)
Also, when logging off with CIFS shares from /etc/fstab still mounted, the system sits at these error messages:
CIFS VFS: server not responding
CIFS VFS: no response for cmd 50 mid 775
And it takes about two minutes to time out. This also happened with Gutsy, where it was fixed using a script I found on the Ubuntu Forums.
Is there an alternative way to mount the shares so that this doesn't happen, or should the timing of mounting and unmounting be changed so that it works? I think it is an ordering problem with network-manager: the CIFS shares try to connect on startup *before* the network is up, and are unmounted on shutdown *after* network-manager has already been stopped.
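A common workaround for this ordering problem (a sketch only; the hook directories are the standard ifupdown ones, but the script names are illustrative and this may interact differently with network-manager) is to mount the shares only once an interface is up, and to unmount them before the network goes down:

```shell
#!/bin/sh
# /etc/network/if-up.d/mount-cifs (illustrative name)
# Mount every cifs entry from /etc/fstab once the network is up,
# instead of at the normal (too early) point in the boot sequence.
mount -a -t cifs
```

```shell
#!/bin/sh
# /etc/network/if-down.d/umount-cifs (illustrative name)
# Lazily unmount all cifs shares before the interface goes down,
# so logoff does not hang for two minutes on an unreachable server.
umount -a -t cifs -l
```

With this in place, the matching fstab entries would also carry the noauto option so the boot-time mount does not race the network.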
Hamish
smbfs will eventually be going away; try using cifs instead.
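For example, the same share can be mounted with the cifs filesystem type directly (server, share, and paths below are placeholders; on Hardy the mount.cifs helper ships in the smbfs package):

```shell
# Mount an SMB share via the cifs kernel client rather than smbfs
sudo mount -t cifs //server/share /mnt/share -o credentials=/home/hamish/.smbcredentials
```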