EDIT 6: 5-12 Version:
- Added an option to make the hydra controller's forward be the look direction.
- Added some basic swarming boxes/drones into the scene, and a sword to play with to beat said swarm up. You'll find options on the wall menu to enable the sword, add to or remove from the swarm, and kill them all.
- Added a webcam plane to the scene, and also one in the corner of the player camera for use as a 'hud' with head-mounted cameras. It might be interesting with a wide-angle lens, but my webcam was pretty boring! It is neat to see the video playback 'huge' on the wall though; that was nicer quality than I thought it would be.
- It can come in handy when setting yourself up for the 'Hydra as head tracking' gig if you have an outward-facing camera on the laptop like I do.
- Re-arranged the hand hierarchy so we can grab/move/throw things with both hands now, which should lead to scaling objects shortly.
“Dad, are you making this stuff? You gonna be wacking the things? It is fun to play with” – My 6 year old, after smacking swarmed boxes, and throwing cubes at the webcam picture of my face for 5 minutes and giggling uncontrollably.
Just updated the contents of the ZIP, same download link: Download HERE
It's about time to do another video soon; sorry I haven't yet, but video recording is not nearly as much fun as working on random things!
EDIT5: 5-8 Version:
- Added this ‘Hydra as positional head tracking‘ thing that everyone is talking about, there’s a button on the wall to enable it, just attach the hydra to your head somehow, then hit the button for fun and profit.
- There's a light saber here. It doesn't really do anything yet, but I'm sure you can see where this must go (especially if you can 'dodge' someone else's attack... *wink*).
- You can throw the player now with the LeftButton2, not for the faint of heart, but kinda fun.
- Did some experimenting with a large particle system, basically a point cloud from Microsoft ImageSynth. The experiment failed because you can't stop the particle system from blinking out of existence when it falls out of your camera frustum, which makes it pointless to try to have a huge (in world scale) particle system. It still looks pretty RAD as long as you keep the rough center of the system in camera; you can check it out by hitting the right Shift key, then selecting Cloud from the scenes menu.
- Disabled the 'Gaze Look' when there is no Rift attached, as it doesn't make sense in Hydra-only modes.
- Fixed a case where the camera direction could end up NaN and the world goes black. (Sounds ominous, no?)
EDIT3: Met up with Ben Lang the other day to geek out and view all the latest community demos. He did a nice write-up and video of my toys: you should read it.
EDIT2: I’m throwing an update out here with a bunch of new stuff, mostly aimed at making it easier to get someone up and running than explaining 40 button combinations!
- Smoothed out the startup process of configuring the HYDRA some.
- Integrated a UI system with the hydra hands! There are context menu items on the wrist menu, floating buttons in world space, and menus/toolbars in the player space.
- the BUMPER button on the controllers is the equivalent of mouse click
- the pointer finger on the hands is the equivalent of the mouse position
- Right now there is a shortcut on Right-3 to show/hide a radial menu with a few toys in there. (Looking down, there is a button near your feet to do the same thing.)
- Scale the world (a bit buggy and slow, but it's an interesting effect to change the world scale to giant; try throwing one of the blocks when it is the size of a house, for instance).
- Change the color of the lines that you can draw using the Right-1
- The RightTrigger will pick things up, you can now throw them
- The LeftTrigger will copy objects..
- There is a right wrist menu (eh..) If you are holding an object with the right hand, you can move your left hand over your wrist to pop up a context menu to toggle gravity on that object, delete object, or copy object
- There is a 'delete last 5 copied objects' button out on one of the walls in case you copy a million things and it starts to get slow. (There is no optimization in here yet, it's just a clusterfuck of a playground.)
- There is a 'GAZE STEERING' thing which somewhat works. There is a button on the wall to show the colliders used for it, which will explain it better than words, but basically: if you look ~90 degrees to the right, it will rotate your body slowly to the right.
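The steering behaviour above can be sketched in a few lines. This is my own reconstruction, not the demo's code: the demo uses colliders rather than raw angles, and the threshold and turn speed here are invented values.

```python
import math

# Sketch of 'gaze steering' (reconstruction; constants are assumptions):
# once the gaze yaw drifts far enough from the body yaw, rotate the body
# toward the gaze a little each frame.

TURN_THRESHOLD = 60.0  # degrees of offset before steering kicks in (assumed)
TURN_SPEED = 45.0      # degrees per second (assumed)

def steer_body(body_yaw, gaze_yaw, dt):
    """Return a new body yaw, nudged toward the gaze if the offset is large."""
    # signed smallest difference, in (-180, 180]
    offset = (gaze_yaw - body_yaw + 180.0) % 360.0 - 180.0
    if abs(offset) < TURN_THRESHOLD:
        return body_yaw
    step = math.copysign(TURN_SPEED * dt, offset)
    if abs(step) > abs(offset):  # don't overshoot the gaze direction
        step = offset
    return (body_yaw + step) % 360.0
```

With these numbers, looking 90 degrees off axis turns the body at 45 degrees per second until the offset drops back under the threshold.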
New download link: Download HERE
Got a chance to do a little more work on my prototype, and it's shaping up to be a lot of fun to mess with. There's still a ways to go before it ends up like that ol' world-building video everyone's seen, but it's coming along nicely!
I squashed a lot of bugs, made the base environment faster, and I found a few more fun things that I think everyone should try~!
first video capture! It’s come a long way since this already.
General control theme: Right hand to interact with scene, Left hand to modify how we interact.
Neat notes and changes from the previous version:
- You can now grab objects with the [Right Trigger]
- Objects that you can drag also affect the player. So, for instance, grab one of the blue platforms and stand on it... now lift up your hand (but not tooo far!). Now that you've got floating down, use the [LeftThumbstick] to add forward motion, and the [RightThumbstick] to rotate. BANG! Accidental magic carpet/hoverboard/flying effect. We could probably do with simplifying the controls some, but this sort of happened by accident with the stuff I was making!
- While you are moving objects, you can ‘Clone’ them with the [Left Trigger]
- you can also ‘Toggle gravity’ on them using the [Left Button 1]
- you can also ‘Delete’ them with the [Right Wrist Menu]
- It’s possible to create pretty neat mousetrap type things with just the cubes/platforms and basketballs.
- Most of the scene is now grabbable/movable… try raising the roof! You can really make a good old mess here, and taking apart a building is a lot more fun than I was expecting. I think it'll be even more fun when we can put the scenes together. (SOOOON)
- some of the higher res bits you may not want to clone too many times, like the basketball hoop.. It’s not a very optimized model right now, so each copy adds a lot of draw calls.
- You can still draw lines with the [RightButton1]
- You can still ‘shoot’ balls with the [RightButton4]
- You can still free move the player with the [LeftButton2]
AKA: Things I play with instead of sleeping, after my son goes to bed.
EDIT: Hey there, people coming from Bruce's video! Check out the newer version for a nicer demo: cheers!
Architecture – an interactive basketball arena 3d scene to wander around. Loosely based on a project that we didn't get paid for, so, mileage ++!
Hydra – for movement, and glory!
Line work – Spatial Graffiti, draw lines in 3d space with the hydra, It’s more fun than you’d think.
Basketball – kinda. It wasn’t too exciting, so it’s more or less removed by now.
How it works:
- Rift: head tracking turns the player's viewpoint; it just works! Thanks, Oculus!
- Hydra: the Unity integration is nice, thanks!
- Base unit:
- Leave yourself enough room between the unit and the monitors! NO MONITOR PUNCHING ON MY WATCH!
- I’ve found it works best if you have the controllers ‘docked’ at game start. It just saves some configuration hassle.
- Pick up the controllers from the base station, move them to a comfortable position closer to your body, about as far away from each other as they appear in the rift view, and click the ‘Start’ buttons (The vertical button between the 1 and the 2 buttons) to take control of the hands.
- Right Hand:
- Thumb stick turns the player body.
- Button 1: Points the finger, begins drawing a line in 3d space.
- While button 1 is down, how much the trigger is depressed controls the width of the line, up to a point.
- Button 4 will launch a basketball. Holding the button adds power per frame; letting go fires it off... I know, I know, throwing motions... soon.
- Left Hand:
- Thumb stick moves the player's body
- Button 2: Points the finger, Inserts a ‘move’ helper near the hand
- While button 2 is down, hold the trigger down to 'MOVE' the player body in 3d space. The movement isn't 1:1, so go gentle to start... but everyone I've tried it on has been moving this thing almost as fast as you can move your arm within 30 seconds, with minimal icky stomach feels.
- The icky stomach feels are one of the main things I would like some feedback on, so I can pursue it or steer differently.
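For the curious, a non-1:1 hand-to-player mapping like the one described above can be approximated with a simple gain curve. The constants and function name here are my own guess at the feel, not the demo's actual code:

```python
# Sketch of a non-1:1 hand-to-player motion mapping (assumed curve, not the
# demo's). Small hand motions stay gentle; larger ones are amplified.

GAIN = 4.0   # overall amplification (assumed)
POWER = 1.5  # exponent > 1 keeps small motions extra gentle (assumed)

def hand_to_player_delta(hand_delta_m):
    """Map a per-frame hand displacement (metres) to a player displacement."""
    sign = 1.0 if hand_delta_m >= 0 else -1.0
    return sign * GAIN * abs(hand_delta_m) ** POWER
```

A curve like this is one way to get the "go gentle to start" behaviour: a 1 cm hand move only moves the player 4 mm, while a 20 cm sweep flings you much further.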
The main OVR menu scripts are in there, so if you manage to say, jump off of the world ‘by accident’ and look up at the things receding, you can hit the Right Shift key, and then Return to reset the scene.
Things I'm working on and experimenting with still:
- UI stereoification. NGUI draws well, but I need something better than shooting a ray out for interaction. So far I don't like the feel of the Sixense UI bits; working on some things here.
- Networking. Multiple people in a small space drawing stuff. So far it works on LAN, but who really has 2 rifts/hydras they can put in a room? I'll figure it out!
- Fly mode for the hydra, throttle style movement, basics started, but needs some tweaking before it’ll not make people feel strange.
- 1:1 rotations, started, but I wanted some feedback on other stuff first!
- Holding both bumpers will allow you to scale the player height/viewpoint based on how close/far the 'hands' are from each other. It nearly works; it's just a little too fall-thru-the-floory for my tastes to leave enabled in this version.
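A sketch of what that bumper-scaling could look like. This is my own reconstruction of the idea, not the prototype's code, and the clamp values are invented:

```python
# Sketch of 'hold both bumpers to scale the player' (reconstruction):
# player scale follows the ratio of the current hand separation to the
# separation when the gesture began. Clamps are assumed values.

MIN_SCALE, MAX_SCALE = 0.25, 10.0

def player_scale(start_separation_m, current_separation_m, start_scale=1.0):
    """Scale the player by how far the hands moved apart (or together)."""
    ratio = current_separation_m / start_separation_m
    return max(MIN_SCALE, min(MAX_SCALE, start_scale * ratio))
```

Pulling the hands twice as far apart doubles the player scale; squeezing them together shrinks you, down to the clamp.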
I do architectural interactive/animation/work for a paycheck (haha, I know right!), so I spend a lot of time working with the fundamental problem of how to get non-game people empowered to get around in 3d space. I love toying with 3d spatial navigation systems, so being able to immerse in space with the Rift and prototype with the responsiveness of the hydra has been a lot of fun for me this week. There’s a lot more to play with here!
Most of my position/rotation stuff is made for placing items in the world, but the hydra tuscany demo covers a decent portion of the basics of that idea already. Besides, it was fun to mess with the player positions/orients, so that’s what I’m playing with so far.
So, if by chance you made it this far into my ramblings, here is the link to download the test app and give it a go!: 2013-04-18-BuckHydraLines.zip
Remember those from grade school? Good times. Well, I have been doing a lot of reading in my quest to scale up my dev from the 'hacker' style, which has led me to quite a few books, both To-Read and Have-Read:
Recent reads that have been helpful:
Code Complete, Steve McConnell – I'm in the middle of this book now, and it's pretty much exactly what I've been looking for. Readability is high, and it is filled with practical advice and links/notes leading to further reading on any of the topics it touches. I'm finding it helpful even with basics, as I expand my vocabulary and ease some of the problems of finding random solutions online.
Head First Design Patterns – a very approachable book that describes a large number of recurring problems and possible solutions to software design issues. I found the book quite helpful as an introduction in some cases, and for clarifying and putting vocabulary on some other systems that I had been working with for a while now. It also adds descriptions of pitfalls that I hadn't encountered 'in the wild' yet. Overall: helpful information, odd formatting, glad to have a physical copy.
Coders at Work: Reflections on the Craft of Programming, Peter Seibel – The book is a series of interviews with 15 high profile/ influential programmers. I found this book extremely interesting and finished it in several days. It gives an interesting perspective on the early days of programming, computing, and a ‘hacker’ mentality.
Coder to Developer: Tools and Strategies for delivering your Software – Started well, and the book has some useful thoughts in there mixed in between the old software advice.
3D Math primer for Graphics and Game Development – I’ve found the explanations to be useful for many things in this book, it has been more a reference-when-stuck type book for me than a read-thru however. The book has been helpful, but I find myself wishing I’d gotten the Physical book instead of the Kindle version.
Have any recommendations for me? Let me know!
There is a bit of a void out there in terms of recent information regarding multi touch in Unity. Most of the legwork of development looks to have been done in '09/'10; looking around the forums, there are many more questions than answers, and all out of date. SO! Let's record some of my findings from building a quick-turnaround, multi-platform product config tool. This is the multi touch side of things; there was a slight bit more to keep it all happy with iOS/Android, but for the most part I adopted their conventions where I could and forced the multi touch into that mold.
The first interesting problem is how to begin dev and testing of a multi touch interface when you don't have one to use! (wee!) Fortunately there are a few fun tools built by others in the CV world that can get you up and running quickly. There is much information to be had looking around the NUIGROUP forums based around building your own touch tables. My immediate timeline didn't let me play with full tables (sad!); fortunately Seth Sandler had put together a nice DIY project involving a webcam, some cardboard, glass, and a sheet of vellum, run on the Community Core Vision package. That got me multi touch input with output to the TUIO protocol fairly easily!
Next, looking into how to handle the TUIO inside of Unity, there are a few main packages that work well (among others found at tuio.org). Two got serious consideration: the Mindstorm TUIO, and the UniTUIO by the guys at xtuio. The code and overall base of the Mindstorm system looked more solid and cohesive, but due to the schedule I ended up with UniTUIO for expediency's sake, because it converts the TUIO touches to Unity's base input system by way of a bridged Input class that overrides the get-touch functions. There is some editing needed to bring it back up to date with the 3.5 changes, but it's minor and the console error log should get you there.
Now! The fun part: things are starting to work, so we can get down to biz. (Almost.) There are a few other things to take into account: TUIO systems are not handled by the built-in Unity GUI. Alternate UI systems that work fairly easily with them are EZGUI and NGUI. I went with NGUI, and have been very happy with the support response time and solid performance. I also adapted the truly awesome FingerGestures to use the TUIO inputs, so now I've got a framework that works on all platforms, with a loader for FingerGestures that checks for the presence of the TUIO inputs and loads those, or else reverts to base mouse input as needed. Future: combine the two inputs so that there is no manual work on export!
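The loader pattern described above (check for TUIO touches, else fall back to mouse) reduces to something like this sketch. The classes here are placeholders of my own, not the actual UniTUIO or FingerGestures API:

```python
# Sketch of the input-source fallback loader (placeholder classes, not the
# UniTUIO/FingerGestures API). At startup, prefer a TUIO source if one is
# present and actually sending, otherwise fall back to plain mouse input.

class MouseInputSource:
    name = "mouse"

    def touches(self):
        return []  # would poll the mouse position/buttons here

class TuioInputSource:
    name = "tuio"

    def __init__(self, receiving):
        self.receiving = receiving  # true once TUIO packets have arrived

    def touches(self):
        return []  # would drain the TUIO client's cursor list here

def pick_input_source(tuio):
    """Prefer TUIO when a tracker is actually sending, else mouse."""
    if tuio is not None and tuio.receiving:
        return tuio
    return MouseInputSource()
```

The rest of the code then only ever talks to the `touches()` interface, so exports to desktop or touch hardware don't need per-platform branches.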
One more: depending on the touch screens you're using, you may need to convert Windows 7 touches to TUIO; I did this with the Touch2Tuio program. Built-in Windows 7 touches are functional for 1 touch, and the pinch-to-zoom gesture gets converted to the scroll wheel by default in Unity 3.5 (and I believe it started working in 3.0 or 3.4... the forum is vague on this).
Performance-wise, it still misses entirely too many touches for my peace of mind as a long-term solution, but it worked quickly, got us through the initial conference expo, and was a success! So work will continue, and it will get better and bigger and faster and more awesome. Or something. I don't know that I'd suggest this system for a long-running exhibit, but it worked fine in the 3-5 day range for me.
As demoed in this video: OR3D tool walk-thru. The video is a bit on the older side, but it shows the IK fairly well.
The system is a fairly simple rig that attaches as a script to the IK end effector and parses up the hierarchy, collecting any bones that are tagged with a certain empty class (in my case called an EndJoint).
I collect the bone joints at runtime after combining and configuring the hierarchy, and pass them into the CCD IK function as an array of Unity transforms. The bulk of the code is taken almost directly from Jeff Lander's Game Developer article describing the Cyclic Coordinate Descent (CCD) inverse kinematics algorithm, written in 1998.
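For reference, the core CCD loop from Lander's article looks roughly like this. It's a simplified 2D sketch of my own, working on plain positions rather than the Unity transforms the actual port uses:

```python
import math

# Minimal 2D CCD sketch in the spirit of Lander's article (reconstruction,
# not the actual port): each pass walks from the joint nearest the effector
# back to the root, rotating each joint so the end effector swings toward
# the target. Joints are absolute [x, y] positions; the last is the effector.

def ccd_ik(joints, target, iterations=20, tolerance=1e-3):
    """Mutate `joints` in place so the last one approaches `target`."""
    for _ in range(iterations):
        for i in range(len(joints) - 2, -1, -1):
            jx, jy = joints[i]
            ex, ey = joints[-1]
            # angle from this joint to the effector, and to the target
            a_eff = math.atan2(ey - jy, ex - jx)
            a_tgt = math.atan2(target[1] - jy, target[0] - jx)
            rot = a_tgt - a_eff
            cos_r, sin_r = math.cos(rot), math.sin(rot)
            # rotate everything downstream of joint i about joint i
            for k in range(i + 1, len(joints)):
                dx, dy = joints[k][0] - jx, joints[k][1] - jy
                joints[k][0] = jx + dx * cos_r - dy * sin_r
                joints[k][1] = jy + dx * sin_r + dy * cos_r
        ex, ey = joints[-1]
        if math.hypot(ex - target[0], ey - target[1]) < tolerance:
            break
    return joints
```

Because each sub-step is a rigid rotation about a joint, link lengths are preserved automatically; reachable targets typically converge in a handful of iterations.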
The port can be downloaded from here: IKCCD2D.zip
Here is a Revit -> 3dsmax -> Unity project that we worked on earlier this year.
The videos below were captured in HD, directly in Unity, from some premade iTween paths.
The basic workflow was to export the WorkSets from Revit as individual FBX files,
import them into 3dsmax with the File Link Manager, usually set to combine objects by material,
and from there run a set of scripts that collapsed / welded / auto-smoothed / split meshes that were too large (65k verts is a pain), cleaned up the layers, replaced materials, and exported the now Unity-friendly layers into different FBXs for Unity's consumption.
Inside Unity there are a few small steps needed, done mostly with AssetPostprocessors that hook up any external lightmaps that we generated in max, replace/reconnect any materials that are in our standard material library, and attach any components that are supplied in the max user-properties fields.
Then bake the lighting to suit (in this case, just some AO and sunlight, since we needed to be able to turn around Revit design changes -> working webplayer overnight with minimal user babysitting).
This workflow allowed the building owner to 'walk' around the design very rapidly after it was changed.
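As an aside, the "split meshes that were too large" step boils down to chunking triangles and re-indexing vertices per chunk, since Unity meshes were capped at around 65k vertices at the time. A simplified sketch of my own (positions only; the real max scripts also carry UVs, normals, and material assignments, and shared vertices on a chunk boundary get duplicated):

```python
# Sketch of splitting an oversized mesh into vertex-count-limited chunks.
# Triangles are consumed in order; each chunk gets its own re-indexed
# vertex list so no chunk exceeds the limit.

VERT_LIMIT = 65000  # stay safely under Unity's old 16-bit index cap

def split_mesh(vertices, triangles, limit=VERT_LIMIT):
    """Split (vertices, triangles) into chunks whose vertex counts fit the cap."""
    chunks, remap, v_out, t_out = [], {}, [], []
    for tri in triangles:
        # a triangle can add up to 3 new vertices; flush first if it won't fit
        if len(v_out) + 3 > limit and t_out:
            chunks.append((v_out, t_out))
            remap, v_out, t_out = {}, [], []
        new_tri = []
        for idx in tri:
            if idx not in remap:
                remap[idx] = len(v_out)
                v_out.append(vertices[idx])
            new_tri.append(remap[idx])
        t_out.append(new_tri)
    if t_out:
        chunks.append((v_out, t_out))
    return chunks
```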
There are a few more captures up at http://vimeo.com/vsai/videos/ for anyone interested.
I had to integrate a small set of third-party code that calls down to some win32 DllImport bits. To do this I needed to be able to work with some pointers, which in C# need to be declared in an unsafe function so the compiler knows they are unmanaged.
Slightly awkward, but it turns out that there is an -unsafe compiler option that needs to be turned on in Unity/mono, and you can do this very easily on a per-project basis by adding a file 'smcs.rsp' to the root of your Assets dir; in there you can add the text '-unsafe' to add this to the compiler command line.
Other apparent uses mentioned in the threads are setting project-wide defines and pragmas.
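For reference, the whole Assets/smcs.rsp file ends up being just flag lines, one per line; the define shown here is a hypothetical example of the project-wide defines the threads mention:

```
-unsafe
-define:MY_PROJECT_DEFINE
```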
Here's a little Google Reader bundle of (currently) 70 tech art blogs, mostly taken from this thread: http://tech-artists.org/forum/showthread.php?t=1514
and thrown into an easy-to-follow group at: Google Reader Bundle: Tech Arts 2011
for anyone else that's deeply into this Google Reader bug, but doesn't want to add them all by hand.
A little art break: here are a few shots from a recent set of renderings done for a minor league baseball stadium.
I've been working on this tool for a while now; here's a quick captured demo of some of the features of my hospital room design application in action.
Here's a current video of my OR/ICU design project, created in Unity3D, with fully configurable parametric models rigged with both IK and FK, and a tool to quickly draw out any layout of walls, windows, and doors to spatially visualize the placement and reach of equipment needed in an operating room space.
This is a full featured application, including but not nearly limited to:
- Full file save / load for easy sharing of designs within the staff
- Easy to navigate tab based menu for selecting models and assets to insert
- All models give some sort of visual feedback that they can be controlled by the user; e.g. some arms highlight in blue to show they are rotatable, some pieces highlight in green to show they are movable, and some in red to show they are control points that let you drag or re-position the entire hierarchy chain of the selected boom.
- many models have custom logic built in so that they move and rotate within their engineering limits (As best as we could get them in some cases, as some manufacturers were less than forthcoming with information and model details!)
- Customizable system to draw out and place walls, with orthographic or free line drawing, snaps, and dimensions integrated
- multiple wall styles, to quickly change and prototype room designs with windows, peek thrus, walls, headers, and soffits
- fully configurable ‘Hero’ equipment, developed from engineering models, with tolerances for how they can move and be configured together
- rule sets integrated that control how these items may be modified, ex: changing a light head to a flattened light head, or adding shelves to a boom
- this ‘Hero’ equipment is generated at run time as needed, this allows us to quickly configure the various lengths and heights needed to show the flexibility of the equipment system, and turn the tool from a demonstrative sales tool, into a full on design tool
- helpers of various types to show range of motion, overlaps, and possible collisions
- Tools to place staff within the scene, change to their point of view, and look around to see what the end user of the room will see when the full update to their room takes place.
- output to jpg in several sizes, depending on whether the user is trying to print, email, or target a presentation size image
- custom user configurable animation paths
- output to an mp4 video file of a fixed size to enable easy email and sharing of the designs with the customers, architects, and planners that are involved in the process