Somebody may find this interesting. There are significant shifts in play in large companies' attitudes toward proprietary hardware and software. The user community is demanding open hardware and software. Under that risk model, confidence is derived from wide use as well as from direct testing. The relationship and integrity of an application with respect to the entire system is also a significant issue for these companies, as it bears on system life cycles and on preserving system methods as technology improves. The other issue is that system development is based on current system requirements and methods (traceability). Exxon and Lockheed (along with more than 100 large companies) are forerunners in this thinking. I have been in discussions regarding strategies for that open effort. Challenges to adopting that perspective remain, chiefly around commercial value models. Regulatory issues for an open-system development environment are also a significant challenge. Quality testing is still required and code control is still mandatory. Reuse while maintaining integrity is, as always, the user's objective.
I do not pretend that the system described in the attachment is suitable for anything but development at this point. The fact remains that several hundred million people have been trained in this environment in schools, and that level of comfort and ad hoc testing provides a basis for acceptance. Training verification is part of what that group is looking to address as well. Leveraging what users and developers already know is a good thing, even if it only helps facilitate learning and deployment. Those things, combined with the fact that the open source community remains relevant after 20 years of development, are significant. The Open Group community regularly develops and uses de facto standards. This is the backdrop I look at when I consider the shift to a distributed cloud model for SOME types of information.
I feel the IoT effort needs to rearrange what is important; eventually, what it does will matter more than the novelty of the connection path, data availability, and security. There are changes required in the internet backbone, or in segmentation, that need to happen. Perhaps a new concept for TCP/IP is required. For now, the idea of a corporate cloud is a good one. If they can get to a distributed cloud model, so much the better.
In all likelihood, things will move toward a Java virtual machine or some similar hardware abstraction for control processors, and the controls vendors will morph into tool providers offering migration assistance to a common set of control object definitions, much like S88 and S95. The architecture is the important thing, and what I show there is running on a JVM, compiled with a toolset capable of control object execution. Control vendors will likely want to preserve their intellectual property by offering products that let users move from legacy objects to a new set of JVM objects compatible with open hardware. Our challenge is in how to apply this to our benefit. I hope this helps.
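To make that idea a little more concrete, here is a minimal Java sketch of what a common JVM control-object contract and a legacy-migration adapter might look like. Every name in it (ControlObject, DosingPhase, LegacyVendorBlock, LegacyAdapter) is hypothetical; it is not any vendor's actual API or an S88/S95 implementation, just an illustration of the migration path described above under those assumptions.

```java
// Hypothetical sketch only: a common control-object contract executing on the JVM,
// plus an adapter that wraps a legacy vendor block so existing logic can migrate.

// Assumed common contract for a control object running on the JVM.
interface ControlObject {
    String name();
    void scan();     // one execution cycle, called by the scheduler below
    String state();  // e.g. IDLE, RUNNING, COMPLETE
}

// A native implementation of the common contract (toy dosing phase).
class DosingPhase implements ControlObject {
    private final double targetKg;
    private double totalKg = 0.0;
    private String state = "IDLE";

    DosingPhase(double targetKg) { this.targetKg = targetKg; }

    public String name() { return "DosingPhase"; }

    public void scan() {
        if (state.equals("IDLE")) state = "RUNNING";
        if (state.equals("RUNNING")) {
            totalKg += 0.5;                        // stand-in for reading a flow meter
            if (totalKg >= targetKg) state = "COMPLETE";
        }
    }

    public String state() { return state; }
}

// Stand-in for a proprietary vendor block that users already own.
class LegacyVendorBlock {
    private boolean done = false;
    void execute() { done = true; }                // vendor-specific call
    boolean isDone() { return done; }
}

// Adapter: keeps the legacy logic but exposes the common contract,
// which is the intellectual-property-preserving migration route.
class LegacyAdapter implements ControlObject {
    private final LegacyVendorBlock block = new LegacyVendorBlock();
    public String name() { return "LegacyAdapter"; }
    public void scan() { block.execute(); }
    public String state() { return block.isDone() ? "COMPLETE" : "RUNNING"; }
}

public class ControlRuntimeSketch {
    public static void main(String[] args) {
        ControlObject[] objects = { new DosingPhase(2.0), new LegacyAdapter() };
        for (int cycle = 0; cycle < 5; cycle++) {  // toy fixed-rate scheduler
            for (ControlObject obj : objects) {
                obj.scan();
                System.out.println(cycle + " " + obj.name() + " -> " + obj.state());
            }
        }
    }
}
```

The point of the sketch is that once the contract is common and JVM-hosted, the runtime does not care whether an object is native or an adapter around legacy code, which is what would let vendors sell migration tooling rather than locked hardware.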