Lessons from a crisis: telecommuting has a bright future, if…

“New normal” might, in the long run, become a worthy candidate for “Buzzword of the Year”. After months of heavily disrupted “normal” life, populations, states, and economic actors are slowly getting back on their feet and wondering what comes next. Social and professional activities are resuming with caution – or not! – while our leaders take guesses at how to govern our lives in the new context (and at how to pay the bill); here and there, companies big and small are thinking up new work organization models (and ways to pay their bills as well). What we have learned over the past months is that we have an unsuspected capacity for quick and global change – precisely the kind needed to moderate the effects of the climate crisis. What we do with that new knowledge is up to us all.

In the professional space, the world has had a chance to test the idea of working from home at an unprecedented scale. And the good news is that – when applicable – it works. In fact, it works so well that the range of benefits it carries is being seen with new eyes: in a nutshell, savings on real estate and operations as well as an extended talent pool for employers; savings and increased satisfaction for employees; and – less self-centered but so much more critical – a reduction in greenhouse gas emissions.

It is hard to get reliable figures on the impact of commuting on global warming. Aggregating various sources that give the share of transportation in greenhouse gas (GHG) emissions and the proportion of travel that is work- or business-related, I come up with a ballpark estimate of 5-6% of global GHG emissions due to going back and forth to work every day. Enough, in my view, for telecommuting to be taken seriously.
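For what it is worth, here is the shape of that back-of-envelope calculation. The two shares below are illustrative placeholders, roughly consistent with commonly cited ranges, not authoritative data – plug in whichever sources you trust:

```ts
// Back-of-envelope estimate of commuting's share of global GHG emissions.
// Both inputs are rough, illustrative values -- substitute figures from
// your preferred sources (IPCC, IEA, national inventories, ...).
const transportShareOfGHG = 0.20;     // assumed share of global GHG emissions from transport
const commuteShareOfTransport = 0.28; // assumed share of transport emissions due to commuting

const commuteShareOfGHG = transportShareOfGHG * commuteShareOfTransport;
console.log(`Commuting ~ ${(commuteShareOfGHG * 100).toFixed(1)}% of global GHG emissions`);
// -> "Commuting ~ 5.6% of global GHG emissions", in line with the 5-6% ballpark
```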

What is promising is that, unlike many other behavioral shifts, this new paradigm may well happen quickly and on a large scale. Recycling, cutting down on meat, or buying an electric car are all changes subject to conflicting interests – economy vs. ecology, that sort of thing. Their adoption is therefore sluggish, relies largely on individual choices, and sometimes triggers activist action.

Not telecommuting. No, Sir. The pros of telecommuting yield a win (employer) – win (employee) – win (planet) situation which should naturally push towards accelerated adoption, even if the cons should not be underestimated and must be duly handled: loss of personal contact, difficulty of supervision, ill-suited home setups, etc.

In fact, large corporations have started to act. Facebook CEO Mark Zuckerberg recently announced that he expects half of the company's employees to be working remotely by 2030. Automaker Groupe PSA has taken similar steps, accelerating a deployment started in 2019. The number of companies converting to a remote-first logic is growing, as reported in this crowdsourced list.

The current forecast for the US, according to Kate Lister (Global Workplace Analytics), is that “25-30% of the workforce will be working from home multiple days a week by end of 2021”. I have a feeling the shift could be much bigger.

How all this pans out is a matter of choices, and technology will be key to the extent and success of the change: efficiency, comfort and satisfaction all depend on the quality of working conditions.

The telecommuter’s toolbox

Come to think of it, working remotely – be it from home or from within the office – has been an ongoing process for some time, and the limits of what can be done have always been set by the available technology. Communication means are essential. Remember a world with no email or internet? Most of you probably do not. Even in those tech-less (or tech-light) times, there were people working remotely with what they had, i.e. phones and snail mail. It was slow, inefficient and uncomfortable, but it shows that working remotely is nothing new. What can and cannot be done is a matter of technology, that's all.

Where do we stand today? Screen-to-screen meetings are replacing face-to-face ones with more and more success: conferencing software is making progress, and has made even more during the lockdowns. Audio and video work correctly (most of the time, for most of us). Simple documents with text, images and video can be shared. You can even collaborate on them in real time, i.e. work together synchronously. As long as only a limited amount of data needs to transit from one location to another, telecommuting in teams is efficiently handled by today's software. And the best is yet to come with VR video conferencing.

Everything gets more complicated when sharing and collaboration are required on very large data sets, such as those found in my field of work: computer-aided engineering (CAE). Engineers run simulations to predict things such as the behavior of mechanical components, the flow of air around buildings, or how a chunk of metal will become a forged part. In fact, almost anything useful that can be put into equations ends up being simulated on computers. These simulations produce results in very large amounts: tens or hundreds of gigabytes are common. We are very far from a PowerPoint presentation or a PDF report.

Remote CAE: we were on LANs before WANs

When I started my career 25 years ago at Transvalor, it seemed we had a head start when it came to working remotely – or at least from the server room to the workstations of the time. We had developed an innovative parallel solver that would run on large, high-end machines (such as the IBM SP1 and SP2).

Post-processing results was quite straightforward: working on a LAN, it was more or less comfortable to mount a remote disk on our workstations and use our analysis software as we did on local files. There was no real challenge, apart from making sure our company invested in Gigabit Ethernet, which was being released at the same time. Loading the small models of that era from the room next door was not exactly like opening them from our workstations, but it was comfortable enough.

Quite surprisingly, in today's context of larger models and WAN networks, shared disks are still the choice of many CAE engineers. The other popular solution is to open a remote desktop session that sends a stream of images from the remote computer to your local screen. The two solutions are in essence very different, and they highlight the two ways of handling and visualizing CAE results stored on a remote server.

From a shared disk, data travels from the server to the local workstation (client), where all the heavy lifting is done: data processing, display model generation and rendering to the screen – the client's resources are used to their full extent. The price to pay is the time needed to download the data. Even assuming an ideal WAN bandwidth of 1 Gbit/s (i.e. 125 MB/s), the time to load a typical model for visualization is counted in seconds (e.g. 8.5 seconds for 1 million nodes, 5 million tetrahedrons, and a stress result mapped on element nodes).
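To make that 8.5-second figure concrete, here is one way the numbers add up. The storage layout below (double-precision coordinates and results, 32-bit connectivity) is my own assumption for the sake of illustration; the order of magnitude is the point:

```ts
// Rough size of a CAE model: 1M nodes, 5M tetrahedrons, nodal stress per element.
// Assumed layout: 64-bit floats for coordinates/results, 32-bit ints for connectivity.
const nodes = 1_000_000;
const tets = 5_000_000;

const coordinates = nodes * 3 * 8;  //  24 MB: x, y, z per node
const connectivity = tets * 4 * 4;  //  80 MB: 4 node indices per tet
const stress = tets * 4 * 6 * 8;    // 960 MB: 6 stress components per element node

const totalMB = (coordinates + connectivity + stress) / 1e6; // ~1064 MB
const seconds = totalMB / 125;      // 1 Gbit/s == 125 MB/s
console.log(`${totalMB.toFixed(0)} MB -> ${seconds.toFixed(1)} s at 1 Gbit/s`); // ~8.5 s
```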

When remote-desktoping, the client's resources are hardly used. All that is left is the screen, which constantly receives streamed images of the data being processed on the server. On a comfortable wide screen (3840 × 2160) in TrueColor, one uncompressed image weighs ~33 MB. Considering a 60 Hz refresh rate – i.e. 60 images per second – the bandwidth required to work at the same visual quality would be around 2 GB/s. Forget it. Obviously, in real life image compression, caching and other optimization techniques are at work to provide the required features with the most comfort. And this is the price to pay in this case: a loss of visual quality, which is frustrating given the great graphics card sitting idle in the workstation you are visualizing on. A word must also be said about the loss of comfort due to factors such as lag (that split second required for the model to react to your mouse) and stream stability (those 60 images per second need to arrive at a constant rate, which is hardly ever the case).
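The raw, uncompressed bandwidth figure above comes straight from pixel counting (assuming TrueColor is stored as 32-bit RGBA):

```ts
// Raw bandwidth of an uncompressed 4K pixel stream (no compression, no caching).
const width = 3840, height = 2160;
const bytesPerPixel = 4;  // TrueColor stored as RGBA, 32 bits per pixel
const fps = 60;           // refresh rate

const frameMB = (width * height * bytesPerPixel) / 1e6;  // ~33 MB per frame
const streamGBps = (frameMB * fps) / 1e3;                // ~2 GB/s
console.log(`${frameMB.toFixed(1)} MB/frame -> ${streamGBps.toFixed(2)} GB/s at ${fps} Hz`);
```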

No matter which of these solutions is chosen, there are other limitations and caveats beyond visual quality and operating comfort. Take security, for example: exposing a corporate shared folder to a private, uncontrolled home office is probably not a great idea. The same applies, maybe even more so, when a remote desktop session is opened: the entire user account finds itself in a vulnerable situation. And then there is the hassle of installing and updating software on the remote CAE engineer's home hardware, be it the engineering applications needed to work on a shared disk, or the remote desktop environment that allows them to open sessions on their work account.

There is obviously room for improving the working context of telecommuting CAE engineers: we are looking for a blend of enhancements in visual quality, engineer comfort, ease of deployment and maintenance, security and, of course, cost.

The “current normal” remote CAE

In a world used to accessing information stored on “a cloud” (i.e. somewhere and anywhere) for personal and professional purposes, it seems odd that such a technical trade as CAE has not fully caught up with the trend. Most of us use our smartphones or laptops for anything from sharing pictures and videos to personal or professional banking. Remote access to data, and the ability to work on it, has become so natural that it is hard to imagine the world “before”.

In the CAE space, resistance to change is at work: we are still clinging to the “old normal”. But changing is easier than we thought – now we know. Even before the nudge given by the lockdowns, cloud-based CAE was increasing its footprint as a deployment model.

More and more offerings are appearing on the market, either from well-established vendors born in the desktop era, or from new cloud-only companies.

A precursor in this field, Munich-based Simscale has provided a full set of CFD and FEA tools in the cloud on a subscription model since 2012, and is considered by many the market leader when it comes to cloud-only CAE simulation providers.

Others are following the path all over the world: Shanghai-based Simright provides an FEA static analysis simulator and various other cloud-based tools; from Argentina, Caeplex is a “really-easy web-based platform for performing thermo-mechanical analysis right from your browser”; in Stockholm, IngridCloud provides “smart wind simulations” for various applications; further south, in Milan, Conself offers CFD and FEA simulation “on the cloud”; other members of this small but growing group of “new normal” companies include Simulaton, WeStatix, Airshaper, Sim4Design, Simularge and many more.

Among the historical providers, it seems everyone has some sort of cloud-based offering or is about to release one. Dassault Systèmes has its “3DEXPERIENCE on the Cloud” platform, Ansys has developed “Ansys Cloud” with Microsoft Azure, Autodesk offers “Fusion 360”, HBM Prenscia recently released “Aqira”, nCode's web-based platform for analysis and simulation, and Altair offers HPC services, with a new platform named “Altair One” due any day…

All these providers – including Transvalor, Siemens and Hexagon MSC – have cloud-based HPC offerings which can be proposed either directly to their customers or through scientific simulation platforms like Penguin Computing, UberCloud or Rescale. These platforms offer end-users a totally flexible environment to cater for their CAE simulation needs: a wide variety of solvers, and an equally wide variety of cloud providers and hardware from which to select the optimal setup to get the job done. Take a look at the supported software on Rescale and you will understand that the days of on-premise installations and licensing nightmares are numbered. Such platforms are redesigning the market, providing an all-in-one, technically and commercially flexible service to end-users.

The “new normal” remote CAE: browser-based WebGL visualization or bust

Whatever type of remote computing takes place, you will find that visualization of CAE results relies on one of the two old-time technologies discussed above: server-side rendering with subsequent pixel streaming to a “dumb” client, à la remote desktop (RDP), or streaming all the data to the client, which takes care of generating a display model, rendering it, and handling navigation and interaction… à la shared folders.

Although RDP has not undergone any major revolution, it is worth noting the ongoing optimization and enhancement efforts, such as those illustrated by NICE DCV (Desktop Cloud Visualization), now sold on AWS. Handling of lag and quality has certainly improved, but still seems to be of some concern even with today's increased WAN bandwidths. Furthermore, the security question remains open: sessions are being opened from remote, uncontrolled locations.

But certainly the most compelling argument for avoiding RDP when possible is the sheer cost of it. The server needs a GPU to render, and this increases the hardware cost considerably. The NV6, the cheapest GPU instance suitable for remote visualization on Azure, is tagged at $1.092/hour, whereas a GPU-less instance such as the D2as v4 is perfectly appropriate for client-side rendering and costs $0.096/hour – more than eleven times cheaper. Now you know…

Having said this, server-side rendering may be the only choice in some cases: very (very) large models, under-par connections in terms of bandwidth or stability, etc.

Client-side rendering, on the contrary, has been gaining ground ever since WebGL became a standard delivered by all major web browsers, around 2015. Shared folders are still used, but WebGL has paved the way for a more subtle approach. When a desktop application works on CAE results from a shared folder, the entire model needs to be transferred from the server before the client can generate the display model and render it. As seen previously, the amount of data is usually considerable and makes the user experience tedious.

WebGL has enabled a major change by giving any browser the ability to render a 3D object, very much like what OpenGL does for desktop applications. This means that opening the result file, extracting the data, and generating the display model can all be performed on the server. Instead of transferring the full geometry and results of the model, the amount of data can be limited to the compressed and optimized textured triangles that make up the display model. When the transfer involves progressive streaming, the technology is called “3D object progressive streaming”. A more detailed discussion by my friend and Ceetron CTO Fredrik Viken can be found on this page.
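To fix ideas, here is a minimal sketch of the client side of that approach. Everything specific in it – the URL, the raw float32 triangle layout, the trivial shaders – is a hypothetical stand-in for what a real streaming protocol would define (with compression, chunking, result textures, progressive refinement, and so on):

```ts
// Minimal sketch of a WebGL client consuming a server-prepared display model.
// The endpoint and binary layout (raw float32 x,y,z triples forming triangles)
// are hypothetical placeholders, not a real protocol.
async function showMesh(canvas: HTMLCanvasElement): Promise<void> {
  const gl = canvas.getContext('webgl');
  if (!gl) throw new Error('WebGL not available');

  // The server has already read the result file and tessellated the display model.
  const response = await fetch('/models/bracket/displaymodel.bin'); // hypothetical URL
  const positions = new Float32Array(await response.arrayBuffer());

  // Upload the triangles to the GPU.
  gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
  gl.bufferData(gl.ARRAY_BUFFER, positions, gl.STATIC_DRAW);

  // Trivial shaders: no camera transform, no lighting, no scalar-result textures.
  const program = gl.createProgram()!;
  for (const [type, src] of [
    [gl.VERTEX_SHADER, 'attribute vec3 p; void main(){ gl_Position = vec4(p, 1.0); }'],
    [gl.FRAGMENT_SHADER, 'void main(){ gl_FragColor = vec4(0.2, 0.6, 0.9, 1.0); }'],
  ] as const) {
    const shader = gl.createShader(type)!;
    gl.shaderSource(shader, src);
    gl.compileShader(shader);
    gl.attachShader(program, shader);
  }
  gl.linkProgram(program);
  gl.useProgram(program);

  // Describe the vertex layout and draw.
  const loc = gl.getAttribLocation(program, 'p');
  gl.enableVertexAttribArray(loc);
  gl.vertexAttribPointer(loc, 3, gl.FLOAT, false, 0, 0);
  gl.drawArrays(gl.TRIANGLES, 0, positions.length / 3);
}
```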

And what the heck, get a taste of WebGL for yourself (yes, the model is interactive):

This innovative approach has removed the main drawback of client-side rendering – the transfer of raw data – leaving only its advantages: comfort, visual quality, minimal security risk and reduced hardware cost. Obviously, such arguments have convinced software vendors and end-users with development capacity, and more web applications dedicated to CAE result processing are being released than ever before.

Although it is sometimes hard to tell who is using what technology, it is pretty safe to say that these applications provide WebGL CAE visualization: Ansys Cloud, nCode Aqira, parts of Autodesk Fusion 360 and, naturally, the cloud-only providers Simscale, Caeplex, Simright, WeStatix, Conself, Airshaper and Sim4Design. Again, this list may be wrong, and I do not want to hurt anyone's feelings. Drop me a line if I missed your company or misinterpreted your technology – I will be glad to edit the blog.

Migrating CAE visualization to the “new normal”

Moving from an on-premise LAN environment to a remote WAN-supported one is a strategic move. Depending on development capacity and expected impact, the choice will ultimately boil down to server-side or client-side rendering. And it is all about balancing the cost of implementation and the expected benefits each approach provides.

Moving a traditional desktop CAE user interface to server-side rendering involves zero cost (when using RDP) or a very limited one (if a pixel-streaming front-end web app is developed). This reduced investment may explain why the solution remains popular throughout HPC platforms and vendor “cloud” offerings.

WebGL client-side rendering is another beast. New code needs to be written to optimize the 3D object that will be streamed, and the front-end user interface is a completely new application built with Web UI toolkits such as React or Angular. According to Fredrik Viken, our CTO at Ceetron, “developing a CAE visualization framework for client-side rendering using WebGL and the user interface web application is roughly a 10-15 man-year project.” Nothing you would dive into without pondering the ROI.

Fortunately, resources can be pooled, and this seems to be how the market is moving. Even if in-house development has its advocates – preserving IP, keeping control – outsourcing such a task limits the risk and reduces the time to market. This is where CAE cloud visualization SDKs come into play, handling all the deep details of data compression and transmission and letting vendors focus on developing a custom UI on the client.
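As an illustration of that division of labor, the contract between such an SDK and the vendor's UI might look something like the sketch below. This is a purely hypothetical API, not the actual surface of any product mentioned in this post:

```ts
// Hypothetical shape of a CAE cloud visualization SDK contract -- not the API
// of any real product. The SDK owns streaming, decompression and rendering;
// the vendor's web application owns the UI built around it.
interface CaeViewer {
  /** Attach the SDK-managed WebGL viewer to a DOM element of the vendor's UI. */
  attach(container: HTMLElement): void;
  /** Connect to the server-side session that extracts and streams the display model. */
  openModel(sessionUrl: string, modelId: string): Promise<void>;
  /** Typical post-processing controls the vendor exposes through its own widgets. */
  setTimeStep(step: number): void;
  setScalarResult(name: string): void; // e.g. "von Mises stress"
  onProgress(cb: (percentLoaded: number) => void): void;
}

// The vendor's UI code stays small: wire SDK calls to custom buttons and menus.
// (Runs in an ES module, hence the top-level await.)
declare const viewer: CaeViewer; // provided by the hypothetical SDK
viewer.attach(document.getElementById('viewer3d')!);
viewer.onProgress((p) => console.log(`streaming... ${p}%`));
await viewer.openModel('wss://viz.example.com/session/42', 'bracket-run-7');
viewer.setScalarResult('von Mises stress');
```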

A quick yet probably quite complete survey of the SDK offerings leads to a limited number of vendors:

  • The open source platform VTK offers vtkWeb and vtk.js as “tools for building visualization applications”, along with consulting and development services to help build your application. Some examples are provided here and here (ParaView Glance); a minimal vtk.js sketch follows this list.
  • At Ceetron, we offer Ceetron Cloud Components and applications based on them for end-users (see Ceetron Cloud for example, or the Ceetron Analyzer Cloud demo installation).
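
To give a feel for the vtk.js flavor, here is (roughly) its canonical hello-world: a cone rendered entirely client-side. Import paths follow the current @kitware/vtk.js npm distribution and may differ between versions:

```ts
// Canonical vtk.js starter: render a cone in the browser (client-side, WebGL).
// Import paths follow the @kitware/vtk.js npm package and may vary by version.
import '@kitware/vtk.js/Rendering/Profiles/Geometry'; // load the geometry rendering profile
import vtkFullScreenRenderWindow from '@kitware/vtk.js/Rendering/Misc/FullScreenRenderWindow';
import vtkConeSource from '@kitware/vtk.js/Filters/Sources/ConeSource';
import vtkMapper from '@kitware/vtk.js/Rendering/Core/Mapper';
import vtkActor from '@kitware/vtk.js/Rendering/Core/Actor';

const fullScreenRenderer = vtkFullScreenRenderWindow.newInstance();
const renderer = fullScreenRenderer.getRenderer();
const renderWindow = fullScreenRenderer.getRenderWindow();

// Source -> mapper -> actor: the classic VTK pipeline, now in the browser.
const cone = vtkConeSource.newInstance({ height: 1.0, resolution: 60 });
const mapper = vtkMapper.newInstance();
mapper.setInputConnection(cone.getOutputPort());
const actor = vtkActor.newInstance();
actor.setMapper(mapper);

renderer.addActor(actor);
renderer.resetCamera();
renderWindow.render();
```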

Five years into this specific market, our solutions are now recognized and embedded in commercial applications such as Ansys Cloud, Simscale and nCode Aqira. I find this Ansys video very interesting: it describes a hybrid approach in which the traditional desktop application is used to set up a simulation and upload it to the cloud, while the web application allows monitoring the run and post-processing the results.

Our technology is also key in setting up automated and visual simulation workflows in a cross-solver environment for partners such as SAP, BASF or DNVGL.

Technology will be key to the deployment of telecommuting in the CAE space. The large amounts of data at play require special care when it comes to visualizing them remotely. WebGL has paved the way for a true cloud-based approach that combines security, visual quality, user comfort and a substantial reduction in operating cost.

Implementing WebGL visualization is no straightforward task, and it is expensive – which explains some lag in a community that still relies on older solutions such as remote desktops or shared folders. The trend, however, is toward upgrading these technologies, more often than not by relying on specialized cloud-based visualization SDKs such as Ceetron Cloud Components.

Take care and enjoy the code,

Andres