
Edge Computing: Old is New Yet Again

Jan. 21, 2020
The new edge has a bit in common with the client/server architectures of old.

The current discussion on edge computing is reminiscent of the 1970s and 1980s debates over thin vs. thick clients and mainframes, or the 1990s debates over centralized vs. distributed computing (see figure). One view held that users should have thin clients that merely interface with the mainframe, letting the mainframe perform all of the computations.

The opposing view was that users should have thick clients so that the mainframe would not get overloaded and that the “answers” would be more quickly available to the person that needed the “answer,” namely the user. Subsequently, both approaches were implemented to meet the requirements of the environment.

In some cases, the thick client (the PC) was the best solution while thin clients were used where restricting data access was important or where a common “answer” was important for all users.  On top of that, with the coming of high-performance workstations, hybrid solutions emerged.

Edge computing appears to be going in the same direction, in that different architectures will emerge and be implemented and optimized for the operational environment. However, for edge computing, the environment is somewhat more complicated in that it now encompasses both the system hardware and software architecture as well as the network architecture.

The overall implications will be similar to those of the thin vs. thick and distributed vs. centralized debates: edge computing will move away from the general-purpose data servers seen in cloud data centers toward specialized data servers for the applications they support, delivering fast answers to data-lite problems. Likewise, application software will be developed around these process-specific servers and tuned to the market segments they serve, while networks will be tuned to the data and security requirements of the functions being implemented.

Musical Chairs

What does all of this mean for the marketplace and for the companies that compete to provide hardware, software, and network solutions? One possibility is that those that appear to be in the lead today might not be the leaders in the future. For example, Intel and AMD are considered the leaders in data-center servers, both in the cloud and in edge data centers. Will they hold that position, or will the likes of Qualcomm, Nvidia, Samsung, Hisense, MediaTek, or some currently unknown company evolve into the leaders?

Qualcomm and Nvidia are known for their mobile RF solutions and graphics processors, respectively. Yet Qualcomm has an ever-increasing focus on AI and on delivering efficient, scalable processors for the edge environment, while Nvidia is already making inroads into edge computing with its EGX platform.

Maybe we haven’t even heard of the future leaders in edge computing yet. According to Crunchbase, U.S. venture funding for AI/ML-related companies grew from less than $100M (fewer than 50 deals) in 2008 to almost $5,000M (almost 750 deals) in 2017. Likewise, according to Lux Research, AI venture funding in China grew from less than $100M in 2014 to $3,000M in 2018.

In summary, it seems highly probable that what we think is the best solution today will not be the solution of the future. Would it be worthwhile to look inside some of the newest edge-computing/IoT/5G devices and see what’s inside them?

Ted Scardamalia is a Co-Founder and President of MSW Analytics Inc.

About the Author

Ted Scardamalia | Co-Founder & President, MSW Analytics Inc.

Ted Scardamalia’s background ranges from being a Senior Manager in a Fortune 50 company to raising $15.6M in Venture Capital for an eight-person startup.

