We've talked about using Meshmixer to design prosthetics and now have an interesting story from the Toshiba Stroke and Vascular Research Center in Buffalo, New York.
Dr. Ciprian Ionita and his team have developed a method to create 3D-printed vascular models (or "phantoms") using PolyJet printing technology from Stratasys. The PolyJet process can create flexible objects that mimic the feel of human tissue. Neurosurgeons are using these models to plan complex procedures such as the repair of brain aneurysms.
The process begins with a CT scan of the patient's brain. Biomedical engineers extract the critical regions of the vascular (blood vessel) network as 3D surfaces. These surfaces are imported into Meshmixer and used as the basis for designing a printable model which the surgeon can inspect. The model can also be connected to pumps that mimic blood flow and placed into a simulated surgical environment. These planning steps allow life-threatening complications to be identified before the patient is on the operating table.
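The team's exact pipeline isn't detailed here, but the surface-extraction step can be sketched in a few lines. In this hypothetical example, we assume the vessels have already been segmented from the CT scan into a binary NumPy volume; the file names and the iso-level are illustrative only.

```python
# Sketch: turn a segmented CT volume into a printable surface mesh.
# Assumes the vessel voxels are already segmented into a binary NumPy
# array; file names and the 0.5 iso-level are illustrative.
import numpy as np
from skimage import measure  # marching cubes lives here
import trimesh               # mesh handling and STL export

vessels = np.load("segmented_vessels.npy")  # hypothetical 3D binary volume

# Marching cubes converts the voxel mask into a triangulated surface.
verts, faces, normals, values = measure.marching_cubes(vessels, level=0.5)

mesh = trimesh.Trimesh(vertices=verts, faces=faces)
mesh.export("vessel_phantom.stl")  # refine in Meshmixer, then print
```

The resulting STL is the kind of raw surface a biomedical engineer would then clean up and refine in Meshmixer before printing.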
In the video below, Dr. Adnan H. Siddiqui from the Jacobs Institute describes how one of these models was used to save a patient's life.
BIM, or building information modeling, brings all parts of a building design together into one complete system; it's no longer a collection of unconnected parts. The plumbing exists in context with the electrical work, and one can see if there are conflicts that will make the construction or renovation process more complicated.
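Real BIM platforms have dedicated clash-detection engines, but the core idea can be illustrated with a toy axis-aligned bounding-box check (the element names and coordinates below are invented):

```python
# Toy clash check: do two building elements' bounding boxes overlap?
# Real BIM tools use far more precise geometry than axis-aligned boxes.
from typing import NamedTuple

class Box(NamedTuple):
    name: str
    min_xyz: tuple  # (x, y, z) lower corner, metres
    max_xyz: tuple  # (x, y, z) upper corner, metres

def clashes(a: Box, b: Box) -> bool:
    # Boxes conflict only if they overlap on every one of the three axes.
    return all(a.min_xyz[i] < b.max_xyz[i] and b.min_xyz[i] < a.max_xyz[i]
               for i in range(3))

pipe = Box("supply pipe", (0.0, 0.0, 2.40), (5.0, 0.1, 2.50))
tray = Box("cable tray",  (2.0, -0.1, 2.45), (2.3, 0.3, 2.55))
if clashes(pipe, tray):
    print(f"Conflict: {pipe.name} intersects {tray.name}")
```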
With the Internet of Things (IoT), we're starting to see new devices for buildings like smart thermostats, talking fridges and lights that react to your presence. What happens when we add BIM to this, and how does it help the day-to-day operations and management of the building? This is exactly what the Environment and Ergonomics team at Autodesk Research set out to discover with Project Dasher.
Project Dasher connects BIM with the IoT
What seems like a simple question kicked off a whole bunch of work. Many newly designed buildings will have a BIM. For an older building, like the Autodesk office in Toronto, the team had to create one. They learned about reality capture and laser scanned the building. This resulted in the Digital 210 King project.
They then set about creating a network of sensors and software to monitor conditions in the building: the movement of occupants (so heating levels can be adjusted as people congregate and the position of the sun changes), the amount of lighting relative to natural light entering the building, and levels of energy usage.
Sensors to monitor building performance, including lighting, motion and carbon dioxide.
As you can imagine, this creates a lot of data for building operations people to deal with. The team developed novel ways to represent the data in the context of the building. You can see heat data laid over the model as well as the paths of people moving through the building.
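Dasher's internals aren't described in this post, but as a rough sketch of the kind of aggregation that could sit behind such an overlay, assuming readings tagged with a zone identifier:

```python
# Sketch: average sensor readings per building zone so they can be
# colour-mapped onto the BIM. Zone names and readings are invented.
from collections import defaultdict

readings = [
    ("zone-2F-east",  "temperature", 22.5),
    ("zone-2F-east",  "temperature", 23.1),
    ("zone-1F-lobby", "temperature", 20.4),
]

totals = defaultdict(lambda: [0.0, 0])  # zone -> [sum, count]
for zone, kind, value in readings:
    if kind == "temperature":
        totals[zone][0] += value
        totals[zone][1] += 1

for zone, (total, count) in sorted(totals.items()):
    print(f"{zone}: {total / count:.1f} °C")  # drives the overlay colour
```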
As occupants move through the building, their motion can be visualized with overlapping trails to highlight the busiest areas
Exploring the heat of a building
The following video shows how some of this works in real time.
Now that I have all this data, what do I do with it?
At this point, the team can begin to learn from the data and apply it to other buildings for the ultimate sustainable design project. Designers could use the same tools that building operations people use to simulate the building before it is built. They can try different sustainability techniques and technologies for optimal building performance.
You can read more about this exciting work on the Digital Environment page. If you liked this article, please share it with your sustainable design friends through the links below.
Have you ever wanted to view molecules in high-fidelity from the comfort of your own web browser? Well now you can thanks to the Bio/Nano group at Autodesk Research.
Merry Wang from the group explains that the Bio/Nano group is building on Autodesk's expertise as a toolmaker for designers of things like buildings, cars and roads. This group is making tools to design living things. They're starting with a tool to visualize complex data at the nanoscale.
In the image below, you can see a sample of the viewer displaying a Recombinant Hemoglobin molecule.
Users of the Molecule Viewer can bring in their own custom data or draw from the RCSB Protein Data Bank (PDB). Here's a view of the Crystal Structure of Human Fibrinogen.
Here's an alternate view; instead of looking at the structure in Ribbon mode, we're looking at a combination of Ball & Stick with Surface display. The highlighted portion starting at the top left is Chain B.
Each display mode has a number of options such as chain, residue and bfactor, as you can see below in this view of the Crystal Structure of the full-length Human RAGE Extracellular Domain (aka 4LP5), the RCSB PDB Molecule of the Month for June.
The UI is nicely laid out and allows you to explore the Chains, Residues and Atoms of the molecule quickly and easily.
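The Molecule Viewer's own API isn't covered here, but you can pull the same underlying data yourself. Here's a minimal sketch that downloads the 4LP5 entry from the RCSB file server and counts the atoms in each chain:

```python
# Sketch: download the 4LP5 entry from the RCSB PDB and list its chains.
# Uses the public file-download endpoint and standard PDB fixed columns.
import urllib.request

url = "https://files.rcsb.org/download/4LP5.pdb"
pdb_text = urllib.request.urlopen(url).read().decode("utf-8")

chains = {}
for line in pdb_text.splitlines():
    if line.startswith("ATOM"):
        chain_id = line[21]  # chain identifier lives in PDB column 22
        chains[chain_id] = chains.get(chain_id, 0) + 1

for chain_id, atom_count in sorted(chains.items()):
    print(f"Chain {chain_id}: {atom_count} atoms")
```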
Sounds exciting, right? Head on over to Autodesk Labs and try it out! You can also check out the Wet Lab Accelerator for assistance in running your tests.
Citeology is an interactive tool for visualizing relationships across research papers, created by Justin Matejka, Tovi Grossman and George Fitzmaurice of the UI Group. Selecting any one of the 11,000-plus publications from CHI and UIST will show you its genealogy: its parents (papers that it cites) and its children (papers that cite it).
Beyond being helpful to the user interface community, these graphs are beautiful. We have a wall-sized version of one graph in the Toronto Autodesk office.
The layout of the information is simple and effective. Across the horizontal axis is a listing of all the papers by year. As time progresses, more papers are published, much like our growing human population. Parent, or past, papers are connected by blue lines, while children, or future, papers are connected by red lines.
The lines drawn between papers are semi-transparent and build up to show multiple connections.
Similar to a word cloud, all the titles are displayed with the connected papers being shown in darker colors to stand out.
The complete tool shows some additional information and controls for refining your search results including:
shortest path between papers (a toy sketch of this follows the list)
number of children and parents to show
details about the active paper
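As a small illustration of the parent/child structure and the shortest-path feature, here's a sketch; the paper IDs and citation links are invented:

```python
# Toy citation graph: parents are the papers a paper cites, children
# are the papers that cite it. All IDs and links here are invented.
from collections import deque

cites = {                 # paper -> papers it cites (its "parents")
    "uist2012-a": ["chi2005-b", "chi2009-c"],
    "chi2009-c":  ["chi2005-b"],
    "chi2014-d":  ["uist2012-a"],
    "chi2005-b":  [],
}

def shortest_path(start, goal):
    """Breadth-first search along citation links, in either direction."""
    neighbours = {p: set(ps) for p, ps in cites.items()}
    for p, parents in cites.items():
        for parent in parents:
            neighbours.setdefault(parent, set()).add(p)  # add child link
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in neighbours.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(shortest_path("chi2014-d", "chi2005-b"))
# -> ['chi2014-d', 'uist2012-a', 'chi2005-b']
```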
Citeology uses research papers and it's interesting to think about what other kinds of relationships a tool like this could help to visualize:
Building on genealogy, things like family trees, band memberships, and sports teams are likely candidates
Historical figures and events along with their triggers
Connections and dependencies between things in the Internet of Things
What would you use it for? Try Citeology and let us know what you think!
David Benjamin explains how his team at The Living worked on Björk's Black Lake display at the MoMA. The Living took on the challenge of using sound from the music to design the cinema room where the piece plays. In this video you can hear him talk about it and see some of the tools the team used, including 3ds Max and Dynamo.
You can see the full piece at the MoMA in New York City - here's the trailer to whet your appetite.
The CHI conference, which showcases the very best advances in computer science, cognitive psychology, design, social science, human factors, artificial intelligence, graphics, visualization, multimedia design and more, is fast approaching, with Autodesk participating as both a proud sponsor and a presenter. The theme for CHI 2015 is "Crossings": crossing borders, crossing boundaries, crossing disciplines, crossing people and technology, crossing past and future, crossing physical and digital, crossing art and science, … crossing you and me.
This year Autodesk Research has three papers receiving Honorable Mentions (the top 5% of all submissions):
Fraser Anderson, Tovi Grossman, Daniel Wigdor (Department of Computer Science, University of Toronto) and George Fitzmaurice look at ways to conceal your usage of mobile devices and stay connected without offending your co-workers.
Madeline Gannon (Carnegie Mellon University, Pittsburgh), Tovi Grossman and George Fitzmaurice look at skin-based input through augmented reality for new design possibilities.
There has been a longstanding concern within HCI that even though we are accumulating great innovations in the field, we rarely see these innovations develop into products. Our panel brings together HCI researchers from academia and industry who have been directly involved in technology transfer of one or more HCI innovations. They will share their experiences around what it takes to transition an HCI innovation from the lab to the market, including issues around time commitment, funding, resources, and business expertise. More importantly, our panelists will discuss and debate the tensions that we (researchers) face in choosing design and evaluation methods that help us make an HCI research contribution versus what actually matters when we go to market.
Panelists:
Parmit K Chilana, Management Sciences, University of Waterloo, Waterloo, Canada
Mary P Czerwinski, Microsoft Research, Redmond, United States
Tovi Grossman, Autodesk Research, Toronto, Canada
Chris Harrison, Human-Computer Interaction Institute, Carnegie Mellon University, Pittsburgh, United States
Ranjitha Kumar, Computer Science, University of Illinois at Urbana-Champaign, Champaign, United States
Patrick Baudisch, Hasso Plattner Institute, Potsdam, Germany
Shumin Zhai, Research @ Google, Mountain View, United States
Digital 210 King is part of Project Dasher, a research project designed to study buildings as living organisms. The Autodesk Toronto office was laser scanned, creating a point cloud that could be used to create a Building Information Model (BIM).
To support the community in studying how buildings operate, the Autodesk Research team has provided the dataset for the building. Kai Kostack has worked with the point cloud data to create a very artistic look at the building, which you can watch below.
What is the latest in state-of-the-art baseball analytics?
Who said Autodesk was only about designing stuff?
How do you visualize and analyze mass amounts of video data?
Read on for answers to these pressing questions and more.
Video Lens is one of the freshest pieces of work from the User Interface group at Autodesk Research. What's really interesting is that it uses baseball as the foundation for an exploration into big data and analytics.
The Video Lens User Interface Is Easy to Use and Provides Real-Time Feedback
As we know from stories like Moneyball, sports analytics is a serious business and lots of data is captured. The team worked with about 8000 video clips and the metadata captured by the PITCHf/x toolset to quickly visualize what happens in the game of baseball and analyze specific plays. Video Lens had four design goals:
Maximize Information Density: Reveal as much as possible about the underlying data
Treat Events as the Primary Entity: Events are the important parts in the data; each data source (video clip) may have multiple events
Support Rapid Playback of Video Results: If you have to wait for things to happen, the system is of lower value
Encourage Exploration of the Data: Trying things out is low-cost
Imagine you want to see all the knuckleball pitches thrown by a southpaw that were high and to the right. You would circle the target in the pitch zone UI, then mark the pitch and pitcher type. Three quick steps in this interactive tool and all the appropriate video clips play back.
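On the data side, that query boils down to a filter over event metadata. Here's a hypothetical sketch; the field names mimic PITCHf/x-style attributes but are assumptions, and the clip records are invented:

```python
# Sketch of the query behind "knuckleballs from a lefty, high and right".
# Field names (pitch_type, pitcher_hand, px, pz) are hypothetical stand-ins
# for PITCHf/x-style metadata; the clip records are invented.
clips = [
    {"id": 17, "pitch_type": "KN", "pitcher_hand": "L", "px": 0.7,  "pz": 3.4},
    {"id": 42, "pitch_type": "FF", "pitcher_hand": "R", "px": -0.2, "pz": 2.0},
    {"id": 65, "pitch_type": "KN", "pitcher_hand": "L", "px": -0.5, "pz": 1.6},
]

def high_and_right(clip):
    # Crude zone test: positive horizontal offset and above ~3 ft.
    return clip["px"] > 0.5 and clip["pz"] > 3.0

matches = [c for c in clips
           if c["pitch_type"] == "KN"       # knuckleball
           and c["pitcher_hand"] == "L"     # thrown by a southpaw
           and high_and_right(c)]

for clip in matches:
    print(f"queue clip {clip['id']} for playback")
```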
The Video Lens Strikezone User Interface Allows the User to Quickly Isolate Specific Pitches
Want to narrow that down to the team you're facing next week?
The Video Lens Teams User Interface
What if you were a sports commentator and had to fill in one of those awkward moments while a streaker was being escorted off the field? You could go to Video Lens and talk about potential strategies, with real-time feedback to illustrate. Or maybe you just want to see how many balls are hit to the area of the field your streaker was spotted in and talk about the chances of him being hit by a live ball.
The Video Lens Field User Interface
Intrigued? Here's a video showing the Video Lens system in action:
Taking this back to Autodesk, here are some possibilities for use in other tools:
TV broadcasters looking through their database for B-roll footage of a rainy day at a famous landmark
Film editors sorting through possibilities in their storytelling
Connecting video recordings of your real-world product stress testing (e.g. crash test dummies, force-testing chairs, etc.) to your digital product design
We think Video Lens is a great toolset for so many things. Let us know what you think here on the blog or on our Video Lens Research page where you can also find more info and contact information.
By the way, if you happen to be in Hawaii around October 8th, please stop by the UIST Symposium where we'll be talking about Video Lens in person.
As a designer, it's good to create things that generate discussion. It's preferable that the discussion is positive and that what you have created is deemed useful and beneficial to the target audience.
The Autodesk Research Ergonomics Group wants to help you make things that are well received for their human factors. The Ergonomics Group aims to put the human at the centre of the design process. Whether you are creating something as large as a community, an office, a house or a factory; something in between like a vehicle, a handheld tool or a shoe; or something as small as a medical device that might correct a fractured bone, torn muscle or blocked artery, designing with human ergonomics in your toolset will help.
Considering the variety of scales humans operate at, from the very small (a blocked artery) all the way up to the very large (a whole community), the Ergonomics Group is researching the navigation and visualization of multiscale datasets. One example of the kind of thing the group is looking at here is called Splash. Splash keeps some representation of the dataset available and running in real time so that you can always work in context.
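The mechanics of Splash aren't spelled out in this post, but the "always have something drawable" idea can be sketched as picking a representation from the current zoom scale (the thresholds and representation names below are invented):

```python
# Sketch: pick a coarser representation as the viewer zooms out, so
# something is always renderable in real time. Thresholds are invented.
def representation_for(scale_metres_per_pixel: float) -> str:
    if scale_metres_per_pixel < 1e-4:
        return "tissue mesh"        # artery-level detail
    if scale_metres_per_pixel < 1e-1:
        return "full-body model"    # person-level detail
    return "bounding silhouette"    # community-level detail

for scale in (1e-5, 1e-2, 10.0):
    print(scale, "->", representation_for(scale))
```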
The data used in the Splash example may not look exactly like a human. If you combine this framework with the model being pursued by the Parametric Human Project, it may make more sense.
The Parametric Human Project brings together industry and academic experts to create a fully functional, data-driven, digital human model. Working from the inside outwards, project members have captured high resolution scans of bones and the tissues that cover and connect them.
Scanned Arm Muscles from the Parametric Human Project
Over time, this project aims to add human biomechanics, so an arm moves like a human arm, and parametric controls, so that you could tune your digital human model, intelligently interpolating between physically correct models to represent things like age or physical impairments. A doctor might use this to compare a patient's injuries to known datasets and choose the best course of recovery.
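The project's actual interpolation machinery is far more sophisticated, but as a toy version of the idea, assuming two captured models that share vertex correspondence:

```python
# Toy parametric blend between two captured arm models. Assumes both
# meshes share the same vertex ordering (correspondence); the vertex
# data here is a placeholder, and real scan interpolation is much
# more sophisticated than a straight linear blend.
import numpy as np

arm_age_25 = np.array([[0.0, 0.0, 0.0], [1.0, 0.2, 0.0]])  # placeholder vertices
arm_age_70 = np.array([[0.0, 0.1, 0.0], [0.9, 0.3, 0.1]])

def blend(a: np.ndarray, b: np.ndarray, t: float) -> np.ndarray:
    """Linear interpolation: t=0 gives model a, t=1 gives model b."""
    return (1.0 - t) * a + t * b

print(blend(arm_age_25, arm_age_70, 0.4))  # roughly a "40% aged" arm
```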
Taking this back to our chair example, as a designer, this research may help you to answer:
Is this digitally designed chair comfortable?
"Comfortable" is somewhat subjective, but for a chair it could include things like:
Does it prevent fatigue and support good posture?
Is it free of awkward pain points?
Are the arm rests positioned well or sufficiently adjustable?
Is it easy to get in and out of?
Could you write an exam to join a secret organization that supervises extraterrestrial lifeforms while sitting in this chair?
One of the best ways to determine that today is to build or print a physical prototype. As good a solution as that is, it can be both time consuming and costly. Prototypes are often designed to test specific product qualities, so you may need multiple prototypes to answer questions like:
Is the chair strong enough to support an average adult?
Is this combination of foam and fabric aesthetically pleasing and comfortable?
As this work develops, we can imagine a future state where you could use your digital human to design the initial proportions of your chair and then place the digital human in the chair to test it against your comfort measures.
The Parametric Human Project welcomes new contributors who can help in a variety of ways, from doing research to providing equipment and funding. Please join in if you can.