SYS — Systems Engineering Challenges
This note is the first in a series about Systems Engineering approaches to solving problems, ranging from product design to sales and marketing. As the first in the series, it is intended to describe the landscape and at least some of the known issues. Follow-on notes will dig deeper into specific areas.
Looking Back
I ran my first circuit simulation on a multi-user Motorola 68000 workstation and promptly found bugs in the models, which I fixed. But I had lost trust in the simulator, so I moved to paper and pencil to trace out expected analogue and digital behaviour from power supplies to memories and clocks. This eventually got really messy, and I needed to break things down; that need to break things down and partition my signals has helped me ever since.
Later on, I was upgraded to my trusty Sun workstation to simulate and test filters for edge detection, and a Compaq 386 to run FPGA place-and-route and timing checks, all for the same product. There was no tool or framework that connected the dots across the domains; you had to work within each domain's own environment.
In the mid-90s I was lucky enough to lead the Fujitsu laptop development team in the US: an amazing, inspiring and diverse team unified around creating the best PC laptops. We shipped without fans and always topped the benchmarks in our CPU class. To do that, we worked across domains (EE, ME, embedded code, manufacturing, supply chain) to ensure on-time delivery and fast cycles. By the time the band broke up, we had development cycle times down to 7 months from whiteboard to dock. I even presented at the local IEEE chapter on rapid product development. [1]
In 2000 I started a company focused on electronic systems synthesis, with a vision to build better systems tools. We were too far ahead of the market and ended up narrowing our focus to FPGA/PCB design interactions [2]. The technology ended up with a major FPGA company.
Model Based Systems Engineering and SysML
Fast forward, and today Model Based Systems Engineering is getting a lot of buzz. There are standards, self-certifying institutes, and a variety of toolsets around SysML in particular. But the evidence says these tools are better suited to requirements traceability and partitioned design in a “buy instead of make” business model, supporting traceability for liability management as well as government contracting.
In spite of the promotion, these tools do not as yet enable net-faster cross-domain work, especially where a tightly integrated team on tight deadlines needs to work across mathematics, code, boot management, cyber security, sensor interfaces, hardware, manufacturing, support and recycling. (The act of system partitioning, however, is valid.)
SysML methods do provide a way of documenting the design, and the beginnings of a framework, with ongoing iteration and updates of the models. But SysML, as is, does not solve the problem. As of today, code generation works only as auto-generated Java for simulation, which is not always functionally complete, performant or cycle-accurate :-)
· SysML is derived from UML thinking, and retains its value for documentation graphics, but has not really gone beyond interface-specified designs for component connections in a systems model.
· Code import is not supported in SysML, making this approach wasteful where code is a key part of the product. (And which product does not have code?) It makes the system model error-prone and likely to be out of sync without a concerted effort at grooming and manual back-annotation. It is essentially not usable for rapid development or decision making.
Essentially, SysML provides traceability of requirements, speeds up the review process for a contract-and-buy decision, and formalizes methods for litigation defense, but it does not enable engineering work to be done faster.
This huge hole needs to be plugged to ensure complex systems can be defined, developed, deployed and supported in the future.
Note that the need for systems thinking remains — in a way that enables leverage without a heavy operational burden, compromised speed and sub-optimal economic returns.
Looking at Reality
Today’s engineers use models and frameworks from open source, plus MATLAB or Octave for AI modeling and development, a key workload for engineers in AI/ML, and one not supported in SysML.
ECAD models depend on power supply integrity, signal integrity and logic, none of which is viable in SysML. That results in three more modeling environments; in the ECAD case, though, they can be pulled together with Cadence SKILL or Mentor (Siemens) AMPLE and then interfaced to other toolsets, which is what we did at my first company from 2000 to 2005.
Logic design can be modeled in SystemC or Verilog, for example, and embedded code can now be “run” in containers that drive the outputs of the electrical models directly. There has been significant improvement in these capabilities over the past two decades.
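To make that concrete, here is a minimal sketch of a clocked SystemC model: a 4-bit counter whose output a firmware harness or container could sample. This is my illustration, not code from any project mentioned here, and it assumes the open-source Accellera SystemC library.

```cpp
// Minimal SystemC sketch: a clocked 4-bit counter with synchronous reset.
// Illustration only; assumes the Accellera SystemC library (-lsystemc).
#include <systemc.h>

SC_MODULE(Counter4) {
    sc_in<bool>        clk;    // clock input
    sc_in<bool>        reset;  // synchronous, active-high reset
    sc_out<sc_uint<4>> count;  // 4-bit counter value

    void tick() {
        if (reset.read())
            count.write(0);
        else
            count.write(count.read() + 1);
    }

    SC_CTOR(Counter4) {
        SC_METHOD(tick);
        sensitive << clk.pos();  // evaluate on every rising clock edge
        dont_initialize();
    }
};

int sc_main(int, char*[]) {
    sc_clock              clk("clk", 10, SC_NS);  // 100 MHz clock
    sc_signal<bool>       reset;
    sc_signal<sc_uint<4>> count;

    Counter4 dut("dut");
    dut.clk(clk);
    dut.reset(reset);
    dut.count(count);

    reset.write(true);
    sc_start(25, SC_NS);   // hold reset across the first clock edges
    reset.write(false);
    sc_start(100, SC_NS);  // free-run for ten clock cycles

    std::cout << "count = " << count.read() << std::endl;  // expect 10
    return 0;
}
```

The useful property is that the logic model is executable rather than diagrammatic: the same module can be wrapped so that a containerized firmware build drives its reset and reads its count over sockets or shared memory.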
Mechanical engineers need physical form-factor models as well as thermal models on which to build cost, cabling, airflow, thermal dissipation and assembly-sequence models, as examples. These tools are generally broken into two camps, Pro/E and SolidWorks, and their integrations; both packages support basic integration with key ECAD packages. A quick check with friends shows SolidWorks dominating, driven by pricing, ease of use and capability.
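For a flavor of the numbers such thermal models trade in, here is a minimal first-pass budget sketch (my illustration, with assumed example values, not data from any product above): junction temperature from ambient temperature, package power and a lumped junction-to-ambient thermal resistance.

```cpp
// First-pass thermal budget: T_junction = T_ambient + P * theta_JA.
// All values below are assumed examples, not data from any product.
#include <iostream>

int main() {
    const double t_ambient_C = 35.0;  // worst-case ambient, deg C
    const double power_W     = 6.0;   // sustained package power, W
    const double theta_ja    = 8.5;   // junction-to-ambient resistance, deg C/W

    const double t_junction_C = t_ambient_C + power_W * theta_ja;
    std::cout << "estimated junction temperature: "
              << t_junction_C << " C\n";  // 35 + 6 * 8.5 = 86 C
    return 0;
}
```

An estimate like this, checked against the silicon's rated maximum, is the sort of early cross-domain sanity check a fanless design depends on.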
Vibrational, chemical and biological models are not easily integrated into anything; these may also need to be MATLAB-driven and based on fundamental physics and chemistry. Mechanical models would need vibrational and temperature models derived from mechanical mass and materials-science models. (Chemical-bond vibrational models are a completely different game, tying into materials analysis, spectroscopy, drug discovery and more; it is an engrossing area that I encourage the reader to dig into.)
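As one concrete instance of deriving a vibrational model from mass and materials data, here is a minimal sketch (again my illustration, with assumed example values) of the fundamental natural frequency of a lumped mass-spring idealization, f_n = sqrt(k/m) / (2*pi).

```cpp
// Fundamental natural frequency of a lumped mass-spring idealization:
// f_n = sqrt(k / m) / (2 * pi). Example values are assumptions.
#include <cmath>
#include <iostream>

int main() {
    const double mass_kg   = 0.050;  // 50 g component on its mount
    const double k_N_per_m = 2.0e4;  // effective mount stiffness, N/m
    const double pi        = 3.14159265358979;

    const double f_n_Hz = std::sqrt(k_N_per_m / mass_kg) / (2.0 * pi);
    std::cout << "fundamental natural frequency: "
              << f_n_Hz << " Hz\n";  // ~100.7 Hz
    return 0;
}
```

The stiffness k is where the materials-science model feeds in; keeping excitation sources such as fans and motors away from that frequency band is the design decision the number informs.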
None of these models should be double-entered into a SysML model/simulation; the cost of doing so is too high, and so is the risk of errors. The models themselves become core IP of the company and need to be protected the same way source code is. Maintaining the models also enables ongoing modeling as new information arrives, allowing informed roadmap decisions for the future.
How each company implements these practices eventually becomes part of the company's DNA on the technical side, supporting operational excellence and differentiation. It is not transferable, but it is valuable. Adversaries should not have access to this knowledge and methodology.
Don’t Forget About Manufacturing
One of the big issues in today's world is the ongoing separation of design and manufacturing. As the cost of manufacturing equipment continues to come down, it is my opinion that manufacturing should be kept in-house for as long as possible. The reason for this is brutally simple: if you are truly innovating, you will need to do something new in manufacturing, whether in process, materials, assembly, test or packaging, or all of the above. And each one of these options ties back to design and market choices you get to make.
Using your hard-earned funds to pay someone to take away knowledge and flexibility you should keep is worth thinking about. Think carefully about what to outsource and what to keep internal. There is a balance, and it may change over time.
What Next?
This mix of tools, programming languages, interfaces and capabilities makes it hard for most companies to create full systems-modeling capabilities. It is often easier to just build a sample and test things out. However, as the cost and complexity of systems go up, along with the risk of failure, a cohesive approach to this problem is needed.
To be continued …
Fun References from the past:
[1] IEEE SCV chapter meetings, Sep 1997 to Jun 1998: https://site.ieee.org/scv-tems/ems-scv-meetings-sep-1997-jun-1998/
[2] Xilinx white paper on early PAI technology: https://www.xilinx.com/support/documentation/white_papers/wp174.pdf