
by Carsten Puls
Computer-aided design (CAD) has had an unprecedented impact on how buildings are envisioned and constructed. From the early 1980s, when AutoCAD revolutionized 2-D drafting, to the present, when material modeling, earthquake simulation, and photo-realistic 3-D rendering are common, computer-based tools have affected every aspect of the architecture/engineering/construction (AEC) industry.
However, because these software tools were designed to run on relatively powerful computers with high-end graphics processing units (GPUs), access has been limited to the designers running those workstation-class machines. This status quo was accepted in the past, but new and growing demands are driving the need for change.
The AEC industry is under continuing pressure to deliver projects faster with fewer resources and lower costs. At the same time, project leads, their vendors, and contractors are expected to deliver higher quality, better safety, and full compliance with rapidly changing codes, standards, and green building rating programs. Layered atop all these challenges is an increasing end-customer demand for ever-higher degrees of customization.
These somewhat conflicting demands are driving architects, engineers, contractors, tradespeople, and clients to find ways to work together more efficiently. Communicating ideas visually with CAD tools has always been a key way to avoid surprises late in a project’s construction. However, when those tools can be accessed by only a relatively small set of users, their effectiveness is restricted.
Enter the Cloud
Enabling any CAD software to be delivered from the Cloud to a browser on any device, anywhere, brings down a huge barrier to making digital design tools accessible to everyone involved in a construction project. Perhaps even more importantly, this model opens up a whole new way for team members to collaborate closely with one another. From architect to journeyman, access to the same information and tools helps ensure the best possible outcome with much higher degrees of workflow efficiency and flexibility.
Many firms have already adopted Cloud storage. Software vendors have also begun to offer Cloud storage as part of their building information modeling (BIM) solutions to make sharing data easier. However, this is only a first step, as it still relies on users having workstations with local installs of the CAD software needed to view and edit the models. Further, when users must constantly shuttle files between a local machine and the Cloud, a lot of time is wasted waiting on data transfer.

Running CAD software on virtual machines with powerful graphics processors in the Cloud, close to the data itself, eliminates the upload/download problem. Applications can take advantage of high-bandwidth connections to the data (e.g., over 500 Mbps, compared with a typical office connection that may be two orders of magnitude slower).
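To put that gap in concrete terms, here is a quick back-of-the-envelope sketch in Python. The 2 GB model size and the two link speeds are illustrative assumptions, not figures from any particular project or vendor:

```python
# Back-of-the-envelope transfer times for a BIM model.
# Model size and link speeds are illustrative assumptions.

MODEL_SIZE_GB = 2.0  # hypothetical BIM model size

LINKS_MBPS = {
    "typical office link (5 Mbps)": 5,
    "Cloud-local link (500 Mbps)": 500,
}

def transfer_minutes(size_gb: float, mbps: float) -> float:
    """Minutes to move size_gb gigabytes over an mbps megabit-per-second link."""
    bits = size_gb * 8 * 1000**3       # gigabytes -> bits (decimal units)
    seconds = bits / (mbps * 1000**2)  # megabits/s -> bits/s
    return seconds / 60

for name, speed in LINKS_MBPS.items():
    print(f"{name}: {transfer_minutes(MODEL_SIZE_GB, speed):.1f} min")
```

At the assumed office speed, syncing a single model takes the better part of an hour; next to the data in the Cloud, the same transfer takes about half a minute. That waiting is exactly what disappears when the application runs beside the data.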
The user interface of the CAD software is streamed from the virtual machine to a standard browser on the end user’s device much like a video. Unlike a one-way video, users can also interact with and control the software. The bandwidth required for this is similar to what is needed to watch a video, but the result is a user experience that feels as if the software is installed locally, even if it is really on the other side of the country.
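A rough calculation suggests why a streamed UI fits in a video-sized bandwidth budget. The resolution, frame rate, and compression ratio below are generic assumptions about H.264-class codecs, not vendor specifications:

```python
# Rough bandwidth estimate for streaming an interactive session as video.
# Resolution, frame rate, and compression ratio are illustrative assumptions.

WIDTH, HEIGHT = 1920, 1080   # 1080p display
FPS = 30                     # interactive frame rate
BITS_PER_PIXEL = 24          # uncompressed RGB
COMPRESSION_RATIO = 200      # rough assumption for H.264-class codecs

raw_mbps = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS / 1e6
streamed_mbps = raw_mbps / COMPRESSION_RATIO

print(f"uncompressed: {raw_mbps:,.0f} Mbit/s")      # ~1,493 Mbit/s
print(f"compressed:   {streamed_mbps:.1f} Mbit/s")  # ~7.5 Mbit/s
```

Under these assumptions the stream needs only a handful of megabits per second, squarely in the range of watching HD video, which is why the session can feel local even over an ordinary broadband connection.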
With this approach, even a smartphone or tablet with a browser can run a high-end CAD application backed by a powerful virtual machine. In many cases, these Cloud GPUs are even more powerful than those in a typical workstation, so a rendering job can be kicked off from a smartphone and the results visualized faster than on a workstation costing thousands of dollars.
Thinly disguised advertisement. If we are to believe the reports (mostly from IT service purveyors), Mr. Puls is right; companies of all kinds are flocking to the “cloud” with their data. Soon, their application software will be there too, if it isn’t already. The cloud is “the next big thing.”
What happened to all the concern for control of one’s data? What happened to concern about intellectual property? What happened to concern about bandwidth?
For myself, it isn’t enough to hear from the masters of Silicon Valley “Oh, those aren’t really problems. Don’t worry about that. Trust us, we know best.” Mr. Puls doesn’t even bother to address any such concerns. For him, as for most in the IT field, they aren’t even worth mentioning – at least, not in public.
Brian, the problems that you mention around data security are certainly valid concerns. These are issues for any IT environment where data is stored, Cloud or on-prem/internal. Traditional approaches to storing data can, in many cases, leave it more vulnerable than keeping it in the Cloud. For example, stories of lost or stolen laptops full of confidential data abound, and breaches of in-house IT environments (e.g., Sony) are common. Cloud storage and remote access to applications can actually address these security issues because no data ever resides on the device: if a Chromebook used to access CAD applications and data in the Cloud is stolen, nothing is compromised. In the end, though, any data is only as secure as the method used to access it: usernames and passwords. But this is true in any environment, and two-factor authentication helps here as well, again in both Cloud and on-prem environments. So I fully agree that the problems you’ve raised are indeed valid, but I would add that Cloud-based approaches can actually offer distinct security advantages.
The technology is rapidly evolving, but it’s the non-technical issues that are the real challenge. For example, who owns the data, both during and after the project? Where is the database of record? The quality of the data is also a major concern; nomenclature needs to be standardized for all the participants in the project. Is it an “HVAC unit,” an “air conditioning unit,” or “AC1”? Another example from a project I was involved in: the same location was called Chantilly, the Washington DC Area Office, the East Coast Campus, and the Virginia Office. Which is it, and who decides? Who maintains the data? How granular does it need to be? There are a lot of soft questions that, if not answered, will derail the best technology.