CAE and Docker: a promising combination

Note: To avoid information congestion, I have split my latest findings into two articles. You may be interested in the companion article, too.

Preamble: I live by the dictum that "Successful engineers don't rely on GUIs". Others may fancy using a pointing device for eight hours a day, applying the same repetitive operations to parts that look like yesterday's, which in turn look familiar ad nauseam.

I, on the other hand, believe that CAE automation is a powerful way to success.

Platform independence is another way to success

Remember when Java was the new black? It appears that it is now Docker's turn.

I cannot claim to be an authority on Docker. But my initial reading and experiments have convinced me that

  • Docker's layered data and execution model is clearer and more versatile than anything that I have ever seen (see the small sketch after this list).
  • Docker's small computational overhead and low level of data duplication may outperform many attempts to optimize non-GUI software natively on different platforms.
  • I have much more to learn and much more to understand before I - or my customer - really master the art of creating Docker jobs from dissimilar sources in an optimal way for the problem at hand. I consider the glass half full and intend to fill it from there.
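
As a small illustration of the first point above: every instruction in a Dockerfile becomes a cached, reusable layer, and the layer stack of any image can be inspected with standard commands. The image name below is just a placeholder, not a reference to a specific image.

# List the layers of an image; each Dockerfile instruction produced one
docker image history my_cae_image

# Rebuilding after a change to the Dockerfile re-executes only the layers
# from the changed instruction onwards; the rest are taken from the cache
docker build -t my_cae_image .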

As long as your calculation job takes character data (or a port data stream) as input and only

  • submits its output to a port, or
  • submits its output as a character stream, or
  • operates on files in a directory,

it is extremely simple to get the job up and running in Docker. Then you have become really platform independent.
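
To make this concrete, here is a hedged sketch of how such jobs can be started. The image and script names (my_cae_image, run_job.sh, filter_job.sh) are placeholders, not part of the study below.

# A job operating on files in the current directory
docker run --rm -v "$(pwd)":/data -w /data my_cae_image ./run_job.sh

# A job reading character data on stdin and writing character data to stdout
cat input.txt | docker run --rm -i my_cae_image ./filter_job.sh > output.txt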

How about performance overhead? The next section shows that the performance overhead due to Docker may actually be negative.

Rerunning a well-defined CAE job in three different environments

Simxon's number-cruncher can boot natively into either Ubuntu 18.04 LTS or Windows 10. This ability has enabled me to compile the following table of comparable runtimes:

Runtimes (mm:ss)    Native    Docker
Ubuntu              13:29     11:17
Windows             n/a       14:21

(the job is the same as the one described in this companion article)
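
For those who want to repeat the experiment on their own hardware, runtimes of this kind can be collected with the ordinary time command; the image and script names below are placeholders for the actual job.

# Native run
time ./run_case.sh

# The same job inside Docker, with the job directory mounted into the container
time docker run --rm -v "$(pwd)":/app my_cae_image ./run_case.sh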

The performance gain when moving from native Ubuntu to Docker on Ubuntu is somewhat puzzling. It may be because the disk shared by the two operating systems (and used for the job) is NTFS-formatted. I imagine that Docker may define and apply an ext4-formatted in-memory cache with better disk I/O performance. If this is true, it means that NTFS on Ubuntu comes with an undocumented performance overhead.
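
One way to probe this hypothesis, using nothing but standard tools, is to compare raw write throughput on the NTFS-mounted directory with throughput on the container's own filesystem. The image name below is again a placeholder.

# Write 1 GiB to the NTFS-mounted host directory and report the throughput
dd if=/dev/zero of=ntfs_testfile bs=1M count=1024 conv=fdatasync && rm ntfs_testfile

# Write 1 GiB inside the container, bypassing the host mount
docker run --rm my_cae_image sh -c "dd if=/dev/zero of=/tmp/testfile bs=1M count=1024 conv=fdatasync"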

Further analysis of this initial performance study may provide more insight. Right now, I can only conclude that you should not expect any performance penalty when applying Docker to a non-GUI CAE task.

The Dockerfile

The instructions for assembling contributions from various sources into a setup that will run your job are specified in a so-called Dockerfile. The Dockerfile for the performance study above looks like this:

FROM quay.io/tianyikillua/code_aster
# Mount point and working directory for the job files
VOLUME /app
WORKDIR /app
# Make the Code_Aster and Gmsh executables available on the PATH
RUN ln -s /home/aster/aster/bin/* /home/aster/aster/public/gmsh-3.0.6-Linux64/bin/* /usr/bin
# HDF5 development files, with headers and libraries linked to standard locations
RUN apt-get update
RUN apt-get -y install software-properties-common
RUN apt-get -y install libhdf5-dev
RUN ln -s /usr/include/hdf5/serial/hdf5*.h /usr/include
RUN ln -s /usr/include/hdf5/serial/H5*.h /usr/include
RUN ln -s /usr/lib/x86_64-linux-gnu/hdf5/serial/*hdf5*.so /usr/lib
# PyTables and pip from the Ubuntu repositories
RUN apt-get -y install python-tables
RUN apt-get -y install python-tables-lib
RUN apt-add-repository universe
RUN apt-get update
RUN apt-get -y install python-pip
RUN pip install --upgrade pip
# Python packages for pre- and post-processing
RUN pip install openturns
RUN pip install matplotlib
RUN pip install pygmsh
RUN pip install meshio
RUN pip install --no-binary=h5py h5py

The initial "FROM"-command indicates that the setup begins with an existing contribution.Tianyi Li has kindly offered to the engineering community a Docker image with Code_Aster preinstalled on Ubuntu.

Passing CAE recipes without passing proprietary IP has never been easier

I foresee a future where extremely tailored CAE procedures are developed and exchanged under very inspiring circumstances. A future which has just begun.

But license-bound CAE must catch up in order to join the party

According to its website, Flexera, the market leader in licensing software, has supported Docker since 2015. Nominally supporting something and working well with it may, however, be two different things. Docker processes are very light-weight entities and have been designed to appear in swarms if there is a need for it (as there is in the companion study already referred to).
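
To give a feel for what "appearing in swarms" can look like in practice, here is a minimal sketch that launches eight identical workers on a single-node swarm; the image and command are placeholders, and in a real study each worker would pick its own part of the parameter space.

# Turn the current Docker host into a one-node swarm
docker swarm init

# Start eight identical worker containers from the same image
docker service create --name cae_workers --replicas 8 my_cae_image ./run_case.sh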

The basic business model of the big CAE players is of course to sell access to their software products. They can, however, only earn money if they limit this access. A customer who wants to get a better grip on the stochastic nature of a given problem may not at all like the limitations which the big vendors need to impose in order to survive on the market. This constitutes a problem. The present article wraps up with a possible solution.

I am not alone. You needn't be, either

I firmly believe that conglomerates of loosely coupled specialist companies may serve customers better than highly commercial top-down initiatives from the big CAE players. In that spirit, two Danish specialist companies are currently working on a joint marketing message. The two companies have already fathered the SimxonCARE offering. The findings described above are part of our product development process.

We are definitely open for business. If you see yourself as either a paying customer or a contributor, do not hesitate to make contact.

Website for the defunct company Simxon by Kim Ravn-Jensen is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Permissions beyond the scope of this license may be available at kim@ravn-jensen.dk.