TECHNOLOGY SUMMARY

simpleXecutive Methodology’s patented modeling tools simulate hardware and software performance on embedded systems before any code is written. SIMPLEX accurately synthesizes how a system will behave if built as specified in the model. This enables software engineers, architects, and designers to deliver sophisticated real-time systems that meet requirements at lower cost, on schedule, and with greatly reduced risk. We encourage systematically testing and tuning the overall system design until optimal performance is achieved. DARPA selected the GNU Radio ATSC Flow Graph to demonstrate our methodology; we improved its performance by more than 28% without changing any of the application code or data.
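
As a rough illustration of what evaluating a design from a model before writing code can look like, the C sketch below checks a hypothetical task set against a rate-monotonic utilization bound. The task names, periods, and execution times are invented for this example; this is not the SIMPLEX engine or the REAL® System model format.

    /* Hypothetical sketch: evaluating a candidate task set from a model,
       before any application code exists.  Task names and numbers are
       invented for illustration; this is not the SIMPLEX engine. */
    #include <math.h>
    #include <stdio.h>

    struct task_model {
        const char *name;
        double period_ms;   /* activation period taken from the system model */
        double wcet_ms;     /* estimated worst-case execution time           */
    };

    int main(void) {
        struct task_model model[] = {
            { "sensor_input",   10.0, 1.8 },
            { "signal_process", 20.0, 6.5 },
            { "actuator_out",   10.0, 1.2 },
        };
        const size_t n = sizeof model / sizeof model[0];

        /* Total processor utilization implied by the model. */
        double u = 0.0;
        for (size_t i = 0; i < n; i++)
            u += model[i].wcet_ms / model[i].period_ms;

        /* Liu & Layland rate-monotonic bound: n * (2^(1/n) - 1). */
        double bound = (double)n * (pow(2.0, 1.0 / (double)n) - 1.0);

        printf("utilization = %.3f, RM bound = %.3f -> %s\n",
               u, bound, u <= bound ? "schedulable" : "needs redesign");
        return 0;
    }

Changing a period, an execution-time estimate, or the number of tasks and rerunning the check is the model-level equivalent of a design iteration; nothing downstream has been coded yet.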

Additionally, our table-driven simpleXecutive scheduling and dispatching software adaptively manages the hazard-free execution of the components of the architecture as specified in the model. The simpleXecutive control logic, internal integrity checking and management, scheduling, and dispatching are the same as those embedded in the SIMPLEX system simulation engine. System integrity is correct by design. The same REAL® System model that drives the SIMPLEX simulations is automatically compiled into Agenda tables that drive the simpleXecutive control software.
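
For readers unfamiliar with table-driven executives, the following C sketch shows the general shape of dispatching from a precompiled table. The agenda_entry layout, slot scheme, and component names are assumptions made for this illustration; they are not the actual Agenda table format generated from a REAL® System model.

    /* Hypothetical sketch of a table-driven dispatcher: a precompiled table of
       time slots and component entry points is walked on every major frame.
       Layout and names are illustrative assumptions only. */
    #include <stdio.h>

    typedef void (*component_fn)(void);

    static void read_sensors(void)  { puts("read_sensors"); }
    static void filter_data(void)   { puts("filter_data"); }
    static void drive_outputs(void) { puts("drive_outputs"); }

    struct agenda_entry {
        unsigned     slot;  /* minor-frame slot in which to dispatch */
        component_fn run;   /* component entry point                 */
    };

    /* Hand-written here; in the methodology the table would be compiled
       automatically from the same model that drove the simulations. */
    static const struct agenda_entry agenda[] = {
        { 0, read_sensors  },
        { 1, filter_data   },
        { 2, filter_data   },
        { 3, drive_outputs },
    };

    int main(void) {
        const unsigned slots_per_frame = 4;
        for (unsigned frame = 0; frame < 2; frame++) {
            for (unsigned slot = 0; slot < slots_per_frame; slot++) {
                for (size_t i = 0; i < sizeof agenda / sizeof agenda[0]; i++)
                    if (agenda[i].slot == slot)
                        agenda[i].run();   /* dispatch in the order the table prescribes */
            }
            /* A real executive would block on a timer tick here. */
        }
        return 0;
    }

The point of this structure is that the dispatching logic stays fixed while only the generated table changes, which is consistent with the claim above that the same control logic runs in both the simulation engine and the delivered executive.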

In a very real sense, SIMPLEX simulations have synthesized integration. We can’t say that all integration problems and issues are avoided, but we can say that their number and severity have been greatly reduced.

Integration doesn't need to consume half the schedule and budget; you can gain control over both.

WHAT WE DO

  • We model the system performance before writing one line of code
  • We evolve a system design at the system level by iteratively adding system-level details
  • We test overall system design from the beginning
  • We test the system design to evaluate system performance
  • We purposely stress a system design to failure to understand its sensitivities and limitations (see the sketch following this list)
  • We systematically change the design to evaluate the performance consequences
  • We determine the hardware configuration based on the application needs
  • We always assess the impacts of individual design changes at the system level
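The following C sketch gives a flavor of stressing a design to failure at the model level: an offered input rate is swept against a fixed modeled service rate, and the point where the design first misses its latency budget and then becomes unstable is reported. All rates, budgets, and the M/M/1 latency estimate are invented for this example and are not taken from any real SIMPLEX model.

    /* Hypothetical stress-to-failure sweep over a modeled design.
       All numbers are invented for illustration. */
    #include <stdio.h>

    int main(void) {
        const double service_rate = 800.0;  /* modeled capacity, items per second */
        const double budget_s     = 0.005;  /* end-to-end latency budget (5 ms)   */

        for (double arrival = 100.0; arrival <= 1000.0; arrival += 100.0) {
            double load = arrival / service_rate;              /* offered load */
            if (load >= 1.0) {
                printf("%6.0f items/s: UNSTABLE (load %.2f) <- failure point\n",
                       arrival, load);
                break;
            }
            /* M/M/1 mean time in system as a crude latency estimate. */
            double latency = 1.0 / (service_rate - arrival);
            printf("%6.0f items/s: load %.2f, est. latency %.1f ms%s\n",
                   arrival, load, latency * 1000.0,
                   latency <= budget_s ? "" : "  (misses budget)");
        }
        return 0;
    }

Sweeps like this expose where the sensitivities lie before any hardware is committed or any application code is written.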

WHAT WE DON'T

  • We don't look at the components individually
  • We don't apply a piecemeal approach to system design
  • We don't demonstrate selected functions operating with actual code
  • We don't design components and then force them to integrate
  • We don't wait for integration errors to expose the consequences of changes