
Remote File Permissions



Hello all,

I keep having users accidentally move chunks of files around on our file server, and I'm looking for a way to prevent it. For example, our "Patient Files" server has a massive directory tree: it is first divided into "Active", "Inactive", "Archived" (our code word for deceased) and "Collections". Each of those has 26 subdirectories, A through Z, and each of those contains patient folders such as "Adams, Don", "Adams, Sam", etc. Within each patient folder are more subfolders and, eventually, the actual data files. Too often we find situations like the "Adams, Sam" folder moved inside some other patient's folder, or all the "B" patients moved into the "C" folder.

I can set NTFS file permissions on the server (Windows Server 2003 x64 SP2) to deny remote users (Windows XP SP3) the ability to modify folders and subfolders while still allowing them full control of files. That solves my problem, except... I have a few scripts that run remotely and need to be able to execute DirCreate() and DirMove() on the server.
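For the permission split, something along these lines might work from the server console. This is only a sketch: the "D:\Patients" path and the local "Users" group are my own assumptions, and icacls (included with Server 2003 SP2) syntax should be verified against your setup first:

```batch
:: Deny Delete on subfolders only -- (CI) container inherit, (IO) inherit only --
:: so folders can no longer be moved or deleted, while file ACEs are untouched
icacls "D:\Patients" /deny "Users:(CI)(IO)(D)"
```

A deny ACE on Delete for inherited containers blocks drag-and-drop moves (a move requires delete rights on the source folder) without restricting what users can do to the files inside.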

I've been searching the forums; so far it seems like RunAs() and RunAsWait() are my only options? Maybe I could have those commands execute simple batch files that reside on the server, run with Administrator rights, and accept parameters to create, move or delete the necessary folders?
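A server-side wrapper for that idea might look something like this. Purely a sketch: the file name "movedir.cmd" and the argument order are my own invention, and error handling is omitted:

```batch
@echo off
:: movedir.cmd -- resides on the server, launched via RunAsWait() with admin rights
:: %1 = action (create / move / delete), %2 = source folder, %3 = destination folder
if /i "%1"=="create" mkdir "%2"
if /i "%1"=="move"   move "%2" "%3"
if /i "%1"=="delete" rmdir "%2" /s /q
```

The remote script would then only need to pass an action keyword and the folder paths, keeping all privileged folder manipulation in one auditable place.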

Has anyone played with this sort of thing?

Any ideas?


Edit: PS - We have no domain defined, no Active Directory, just simple peer-to-peer file sharing.

Edit2: I seem to have found some more pertinent forum threads searching on "RunAsWait" and "Batch". Am thinking my answer will be found in one of those.

Edited by Spiff59

  • How does the Staff create a new patient folder? Do they input the info into a program that creates the directory? It sounds like they are manually dragging/dropping the folders into the wrong place. Or do you have just a few people who can create a new patient folder, while all Staff can manipulate the files and folders?
  • Since you do not have AD running, do the remote users have separate or generic user ids? If separate, then you must have added their user id/password to the local server.
  • You are on the right track with the RunAs solution. I would create a special Admin user to run your scripts...this would allow you to audit the access if you need to in the future. You can disable local login for this user, since there is no need for this account to log in, only to run programs with elevated privileges. The only issue that I see is if the Staff need to manually create a new directory but do not have rights to rename folders. You would have to ensure that their procedure is to locally create and name the folder, then move it to the server; otherwise they may only be able to create folders like New Folder, New Folder (1), New Folder (2), etc. if they use Windows Explorer on a folder located on the server.
Edited by Varian


Thanks, Varian.

Yes, I've had a program in place for some time that gathers and presents data pulled from our canned patient accounting program, our home-grown medical records system, and our digital x-ray database. It also creates new patient folders and moves folders around, for instance from "Inactive" to "Active". It is intended to be the sole means of manipulating patient data at the folder level, and is where I'll need to make changes if I implement the new NTFS restrictions. Prior to this program, the number of typos introduced when users were allowed to enter data manually was enough to turn one's hair grey.

Our problems revolve around the use of a (POS) program named PaperPort, one that the doctors and staff have used for years. It took me forever to convert nearly half a million files from PaperPort's proprietary .max format to something more universally palatable, like .doc and .pdf. Anyway, PaperPort is basically an overblown Windows Explorer, and it exposes the full directory structure of our patient files to everyone. Fast clicking, mousing and tabbing periodically results in pieces of the folder tree being accidentally pruned and grafted.

I've been playing with the following code to get an idea of what I'd need to insert into my existing script once I've put the new permissions on the folders:

Local $sServer = "Raptor"
Local $sUserName = "Administrator"
Local $sPassword = ""

$folder = "Testo Folder"
$source = "\\Raptor\D\Test\" & $folder
$dest = "\\Raptor\D\Temp\" & $folder

; Server-side commands to run under the admin account
$mkdir = 'MKDIR "' & $dest & '"'
$xcopy = 'XCOPY "' & $source & '" "' & $dest & '" /E /I /Y'
$rmdir = 'RMDIR "' & $source & '" /S /Q'

Remote_Execute($mkdir)
Remote_Execute($xcopy)

; Compare file + folder counts to confirm the copy before removing the source
$sourcecount = DirGetSize($source, 1)
$sourcecount = $sourcecount[1] + $sourcecount[2]
$destcount = DirGetSize($dest, 1)
$destcount = $destcount[1] + $destcount[2]
If $sourcecount = $destcount Then Remote_Execute($rmdir)

Func Remote_Execute($cmd)
    ; Logon flag 2 = network credentials only (no profile loaded)
    Local $pid = RunAsWait($sUserName, $sServer, $sPassword, 2, @ComSpec & " /c " & $cmd, "", @SW_HIDE)
    Return $pid
EndFunc

It appears a move will require a MKDIR, an XCOPY, and an RMDIR.

The RunAsWait() documentation states it returns "the exit code of the program that was run", but that does not seem correct; it appears to return a PID instead (guess I'll go start a BugTrack on that). So I've tried using DirGetSize() to confirm that the files were copied successfully. There may be a better way...

Edit: I'm thinking the example code I borrowed from wasn't cleaned up when copied from the RunAs() documentation. I don't see that ProcessWaitClose() should be necessary with RunAsWait().

Edited by Spiff59

