Another one of those semi-random requests that turned out to be more of a pain in the arse than it really should have been. The mission: converting a large mass of scanned TIFF CAD drawings into multi-page PDFs.
There are two ways to do this fairly quickly. The easy way, if you don't have many individual sets to work with, is to simply drag all the TIF images into the window in Acrobat Pro and save the file from there.
If, however, like us, you have to convert 42 gigs or so of TIFFs spread across 204 folders and group the PDFs by folder, you can do it programmatically with some batch scripting and a few free command-line utilities from the internet.
First thing to do is convert each individual TIF file to a PDF. For this we use ImageMagick, a tool that can do many things, but we're really just going to abuse its 'convert' application for now.
After installation, you can type 'convert file.tif file.pdf' at the DOS prompt.
For our purposes we want to convert ALL of the files, so we can set it up in a recursive FOR loop, saved into a file called "Convert all Tiffs in Subdirectory to PDFs.bat" or something equally exciting:
FOR /R %%a IN (*.tif) DO convert "%%a" "%%a.pdf"
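For anyone more comfortable scripting this outside of batch, here's a rough Python sketch of the same step: walk the tree and build one ImageMagick 'convert' command per TIF, mirroring the batch script's `file.tif` → `file.tif.pdf` naming. The function name and the `C:\scans` path are placeholders, and newer ImageMagick installs may call the binary 'magick' instead of 'convert'.

```python
import os

def build_convert_commands(root):
    """Return one ['convert', in, out] command per .tif found under root."""
    commands = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith(".tif"):
                src = os.path.join(dirpath, name)
                # mirror the batch script's naming: file.tif -> file.tif.pdf
                commands.append(["convert", src, src + ".pdf"])
    return commands

# To actually run them:
#   import subprocess
#   for cmd in build_convert_commands(r"C:\scans"):
#       subprocess.run(cmd, check=True)
```

Building the command list first, rather than shelling out inside the walk, also makes it easy to eyeball what's about to happen before committing to a 42-gig batch run.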
After that is done, we use a second utility to 'join' the PDFs: the PDF Toolkit (pdftk). With this util we can modify PDF files quickly; we'll use its concatenate ('cat') operation:
pdftk *.pdf cat output combined.pdf
This will concatenate all PDFs in the current directory and write them into 'combined.pdf' in the same directory. Great!
To expand this across a large directory tree, we do a little more batch scripting. The below is saved into a file called "Join all PDFs in directory tree.bat":
FOR /F "delims=" %%a IN ('DIR /S /B /AD ^| SORT') DO (
    PUSHD "%%a"
    pdftk *.pdf cat output "%%a_joined.pdf"
    POPD
)
This will output a combined PDF for each of the individual directories.
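The same per-directory grouping can be sketched in Python as well: for every directory under a root that contains PDFs, build one pdftk command concatenating them into `<directory>_joined.pdf`. The directory layout and output naming follow the batch script above; the function name is mine.

```python
import os

def build_join_commands(root):
    """Return one pdftk 'cat' command per directory that contains PDFs."""
    commands = []
    for dirpath, _dirnames, filenames in sorted(os.walk(root)):
        pdfs = sorted(f for f in filenames if f.lower().endswith(".pdf"))
        if pdfs:
            inputs = [os.path.join(dirpath, f) for f in pdfs]
            # same naming as the batch version: <full dir path>_joined.pdf
            output = dirpath + "_joined.pdf"
            commands.append(["pdftk"] + inputs + ["cat", "output", output])
    return commands
```

Sorting the filenames keeps the page order deterministic, which matters for multi-sheet CAD sets.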
A good time was had by all.
ImageMagick looks to have quite a few useful applications and will warrant some more exploration soon.
Today I had to find all the JPG files over a certain size in a large directory tree of a few thousand images, in order to find some poorly compressed JPGs, fix the compression, and replace them on a website. Today's first caveat: the CMS is pretty weak and only lets you update one image at a time, so you can't just blindly resave everything and reupload. Today's second caveat: I don't have server-level access to said website, so all the normal automated ways are out of luck.
Anyway! The code:
It searches through a directory tree of your choosing
for any file matching the filetype "jpg"
that is above a size threshold, fileSizeThreshold (in this case 2KB, so essentially everything)
and outputs it into an Excel sheet with full path and size information
so that you can easily sort, filter, delegate, or use as data for further automating!
' // **************************************
' // ComputerHighGuys recursive search
' //
' // Date Created: 20 Aug 07
' // http://www.tek-tips.com/faqs.cfm?fid=6716
' //
' // Adjusted to a jpg file search and
' // added excel output for easy sorting/filtering
' //
' // **************************************

' // If we'd like to save the output into an excel sheet
WriteExcel = "True"
Dim objExcel
excelRow = 2

' // value (in KB) the filesize needs to exceed to be visible in the output
fileSizeThreshold = 2

If WriteExcel = "True" Then
    ' // Create the Excel sheet to drop the information into
    Set objExcel = CreateObject("Excel.Application")
    objExcel.Workbooks.Add
    objExcel.Cells(1, 1).Value = "Folder Name"
    objExcel.Cells(1, 2).Value = "Filename"
    objExcel.Cells(1, 3).Value = "Filesize"
    objExcel.Cells(1, 4).Value = "Filesize Unit"
    objExcel.Visible = True
    WScript.Sleep 300
End If

' // Directory to search
searchDir = "C:\"
Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objFolder = objFSO.GetFolder(searchDir)

' // Launch the function
ScanSubFolders(objFolder)

' // recursive sub that will search through all subfolders for a specified
' // filetype and output some information about the files to an Excel sheet
Sub ScanSubFolders(objFolder)
    ' // Grab sub folders
    Set colFolders = objFolder.SubFolders
    For Each objSubFolder In colFolders
        ' // the files to search
        Set colFiles = objSubFolder.Files
        For Each objFile In colFiles
            ' // the extension of the filetype to search for
            If LCase(Right(objFile.Name, 3)) = "jpg" Then
                ' // Getting file size in KB
                If Round(objFile.Size / 1024, 1) > fileSizeThreshold Then
                    ' // echo the file to the console
                    WScript.Echo objSubFolder.Path & " " & objFile.Name & " " & Round(objFile.Size / 1024, 1) & "KB"
                    ' // write the details to excel
                    If WriteExcel = "True" Then
                        objExcel.Cells(excelRow, 1).Value = objSubFolder.Path
                        objExcel.Cells(excelRow, 2).Value = objFile.Name
                        objExcel.Cells(excelRow, 3).Value = Round(objFile.Size / 1024, 1)
                        objExcel.Cells(excelRow, 4).Value = "KB"
                        excelRow = excelRow + 1
                    End If
                End If
            End If
        Next
        ScanSubFolders(objSubFolder)
    Next
End Sub
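For anyone without Excel handy, here's a rough Python equivalent of the search above: walk a tree, collect every JPG over a size threshold, and write the results to a CSV with the same columns, which sorts and filters just as easily. The function names, threshold default, and CSV format are my own choices, not part of the original script.

```python
import csv
import os

def find_large_jpgs(search_dir, threshold_kb=2):
    """Return (folder, filename, size_kb) for each jpg over threshold_kb."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(search_dir):
        for name in filenames:
            if name.lower().endswith("jpg"):
                path = os.path.join(dirpath, name)
                size_kb = round(os.path.getsize(path) / 1024, 1)
                if size_kb > threshold_kb:
                    hits.append((dirpath, name, size_kb))
    return hits

def write_report(hits, out_path):
    """Write the hits to a CSV with the same columns as the Excel sheet."""
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Folder Name", "Filename", "Filesize", "Filesize Unit"])
        for folder, name, size_kb in hits:
            writer.writerow([folder, name, size_kb, "KB"])
```

One small behavioral difference: this version also checks files in the top-level directory itself, where the VBScript only descends into subfolders.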
It's been a while since I've posted anything, so it's time for the habitual blog apology and script update!
We recently picked up V-Ray RT, and I found myself once again VNC'ing over to farm machines to restart services, and well, that just won't do!
So I did a small update to the ServerUtilities script, adding a few services and doing a little work on its underbelly to make it a bit more stable. I also packaged it into an installer, because a few people had installed it strangely and were having issues.
– .65 – added vray 2010 to services
– .75 – added vrayRT
– .75 – convert to simpler dos prompt
– .75 – added an optional pause to the end of the commandline based tools
– .75 – fixed the invert button in server selection
– .75 – prints the commandline to the listener
– .80 – convert to dos based loop for multiple systems (if pause: hit enter to advance to next machine in loop..)
– .85 – update to support spaces in the max file path! oops!
Here’s partial results from last nights maxscript silliness.
QuickCollapse is a MaxScript struct with a few functions to speed up collapsing large numbers of objects; it provides feedback in the Listener window so you can see progress as it goes.
Running the script will trigger 'CollapseSelected' on your current selection. There isn't much error checking involved, so filter your selection for meshes, and remember that it SHUTS UNDO OFF for the collapse so that we don't get RAM overruns on large object sets. So save/hold first, eh?
EDIT 2013: Here’s an update with UI and a few other useful mesh cleanup functions: DOWNLOAD
Well, let's just say that dealing with large Revit files for visualization purposes can get a little ugly, to say the least. There are several fundamental issues currently, ranging from workflows for building usable families inside of Revit, to dealing with the geometry after the fact for rendering.
I had the pleasure of again dealing with a fairly large Revit model. It only weighed in at about 300MB to start! For final renderings that's all well and good, and not a real issue to deal with, but for mid-project progress renders it can get a bit painful doing all the deconstruction needed to make the model usable in 3dsMax. Spending several hours 'cleaning' a Revit model in Max just to get any sort of rendering done is, let's say, a bit depressing when you know you'll have to do the same process again in two weeks.
All that said, I started looking for solutions to one small facet of the problem, and found some good case-study testing on efficiently attaching a ton of meshes by Dave Stewart, as well as a fair number of tips from the Maxscript crew at CGTalk.
So I did a small adaptation of Dave's attachment script, which can be found here: CollapseSelected-inParts5.ms. It's not pretty, but we'll get there soon enough.
It takes your current selection and reduces the number of objects by the square root (Dave's tested optimal amount for speedy collapsing!). It's a work in progress; I figure I'll combine this with a set of other tools for collapsing by selected material, similar objects + instances, layer, and some name filtering. That should take the day-long Revit cleanup jobs and compress them down to an hour or so.
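The sqrt-batching idea is worth spelling out: instead of attaching n objects one by one into a single ever-growing mesh, you attach them in roughly sqrt(n) groups of sqrt(n), then attach the groups. Here's a minimal sketch of just the partitioning, in plain Python for illustration (the actual attach work happens in MaxScript; the function name is mine):

```python
import math

def sqrt_batches(items):
    """Split items into ceil(sqrt(n)) chunks of roughly equal size."""
    n = len(items)
    if n == 0:
        return []
    chunk = math.ceil(math.sqrt(n))
    # slice the list into consecutive chunks; the last may be shorter
    return [items[i:i + chunk] for i in range(0, n, chunk)]
```

The win comes from keeping each attach operation small: attach cost grows with the size of the target mesh, so many small merges followed by a few merges of the results beats one giant serial merge.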
We have a fun little Virtools app that loads in various NMOs exported from Max on the fly, as needed, based on user input. In the process of working out what our process would be, we found we needed each separate 3dsMax hierarchy exported out into a separate Virtools NMO file.
Pretty simple, but the question came up on a forum today, and it's been a useful little script that I've been using for years now.
The guts are:
--wanky recursive function to parse a hierarchy into an array
fn addChildrenToArray theChildren currentObjsToExport =
(
    for c in theChildren do
    (
        append currentObjsToExport c
        addChildrenToArray c.children currentObjsToExport
    )
)

fn massExportfn hier path filetype =
(
    exportSelection = selection as array
    if hier == true then
    (
        baseNodesToExport = for o in exportSelection where o.parent == undefined collect o
    )
    else baseNodesToExport = exportSelection

    --parse through said root nodes
    for o in baseNodesToExport do
    (
        oldPos = o.position
        o.position = [0,0,oldPos.z]

        --.children returns a NodeChildrenArray, so convert that to an array manually.
        --include the current node in the array no matter what.
        currentObjsToExport = #(o)

        --if we're packaging hierarchies then collect children
        if hier == true do
        (
            addChildrenToArray o.children currentObjsToExport
        )
        select currentObjsToExport

        --random info for us to ogle at during the export.. yea yea
        format "\tExport:\t%\tas\t%\n" o.name filetype
        format "\tTo:\t%\n" (path + "/" + o.name + filetype)
        format "\n-----------------------------\n\n"

        --if we need to save selected, use saveNodes instead of exportFile
        if filetype == ".MAX" then
        (
            saveNodes currentObjsToExport (path + "/" + o.name + filetype) quiet:true
        )
        else
        (
            --export using the name of the root node as the filename
            exportFile (path + "/" + o.name + filetype) #noPrompt selectedOnly:true
        )
        o.position = oldPos
    )
    select exportSelection
)

--and it's used by:
massExportfn true "C:/temp/" ".vmo"
There's an interface for it here, but I've got it built into many other small pipeline-specific utils, so that may not be of any real use to anyone else; this little snip might be useful for someone out there, though.
This handles the services perfectly for me here. I'm doing a little more work on it to support the server.exe and vrayspawner.exe versions as well, but that won't be finished for a while, as it's a spare-time project. I'm really looking for feedback on which platforms you're looking to use it on, and what additional features would be nice to see.
Install is pretty straightforward: simply unzip into your Scripts directory. After a bit more testing by various people I'll throw together an installer, but for now it's slightly manual.
After you unzip it, assuming your Max is installed in, say, C:\3dsmax08\, you should see several files.
MaxScript files (access them via the MaxScript pulldown, Run Script):
C:\3dsMax08\server_tool_lite_05.mcr — Run this to install the script into the Customize UI, under the category dbScripts.
C:\3dsMax08\ServerUtilities\server_tool_lite_05.ms — Run this to try out the tool without installing it into the Customize UI.
C:\3dsMax08\ServerUtilities\… There will be a few other EXEs in this directory, mostly command-line utilities that the script calls to batch-modify things.
After you run the script the first time, it will create a file, C:\3dsMax08\ServerUtilities\ServerToolLite.ini, which saves your settings. It might be easier to add one server through the script, then browse to the ServerToolLite.ini file and manually add the other 19 servers in Notepad.
A couple of simple tricks you can do with the remote process execution of the Slave Utils, for fun. (Also the corresponding command lines for use with psexec, which is what the Slave Utils is batching.)
You can start a remote command line on any of the machines in your list simply by putting cmd.exe in the 'ProcessToStart' line. It's a small thing, but it's helpful for certain tasks, e.g. copying files from one remote machine to another remote machine; it's *MUCH* faster if you do something like that FROM one of the remote machines and skip the middleman.
How to start VNC on a remote machine, and have it work interactively, when you're not allowed to install it as a service! (Read: our IT uses its own copy of VNC that we aren't allowed to co-opt for our own uses, so we end up with two copies running!)
Apparently it's tricky to run interactive remote processes that are allowed to interact with the desktop under full local privileges?! Go figure. Anyway, this is what you would enter to launch VNC with the same privileges as if you were at that computer and hit the shortcut:
cmd.exe /c start "c:\program files\Realvnc\winvnc\winvnc.exe" "c:\program files\Realvnc\winvnc\winvnc.exe"
(The path appears twice because 'start' treats its first quoted argument as a window title; the second is the actual program to launch.)
What this converts to on the actual command line is something along the lines of:
psexec \\computer -i -d -u DOMAIN\Login -p password cmd.exe /c start "c:\program files\Realvnc\winvnc\winvnc.exe" "c:\program files\Realvnc\winvnc\winvnc.exe"
So it uses the login and pass to launch the remote cmd.exe, but the start command then runs under the local system's privileges, allowing the process full network and desktop interaction.
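Batching that trick over a whole machine list is straightforward; here's a hedged Python sketch that builds the same psexec command line per slave. Machine names, credentials, and the VNC path are placeholders, and only the flags shown in the example above (-i interactive, -d don't wait, -u/-p credentials) are used.

```python
def build_psexec_command(computer, user, password, process):
    """Build the psexec command line for launching an interactive
    remote process, mirroring the example above. 'process' is passed
    twice because 'start' treats the first quoted argument as a
    window title."""
    return (
        ["psexec", "\\\\" + computer, "-i", "-d", "-u", user, "-p", password]
        + ["cmd.exe", "/c", "start", process, process]
    )

# Example, with placeholder machine and credentials:
#   import subprocess
#   for machine in ["render01", "render02"]:
#       subprocess.run(build_psexec_command(
#           machine, "DOMAIN\\Login", "password",
#           r"c:\program files\Realvnc\winvnc\winvnc.exe"))
```

Returning the command as a list (rather than one string) sidesteps quoting headaches around the spaces in "program files" when handing it to a process launcher.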
More to come later.
- Run the .HTA once; it will create two files for you:
- "Computers.ini", which you will need to edit to contain one slave computer per line (either an IP address or a computer name works; use 127.0.0.1 for your local system)
- "SlaveutilsOptions.ini", which has slots for various paths and options. Read the file and edit the lines accordingly:
- the path to your VNC client,
- whether to use alternate credentials or not,
- and what the login/password would be if you did use them.
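If you want to drive the same machine list from your own scripts, reading Computers.ini is trivial. A minimal sketch, assuming only the format described above (one slave per line, blank lines ignored; the function name is mine):

```python
def read_computers(path):
    """Read one machine name or IP per line, skipping blank lines."""
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]
```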
Note: it's been brought to my attention that some virus scanners will flag this download (understandably, in my opinion) as 'greyware', because it contains VBScript that uses WMI (Windows Management Instrumentation) to effect changes on remote computers, enclosed in an HTA (read: executable web page). The full code is in there and easily readable.
I've been experimenting with a few other ways to package it, either via MaxScript or Python, but they tend to get constrained to 'free' time, as this version already functions perfectly well.