Edge Computing for Machine Vision – a windfall for efficient automation

The possibilities of Machine Vision have expanded significantly in recent years thanks to powerful vision controllers. Today it is possible to capture images, perform measurements and carry out inspection tasks that few would have dared to imagine just a few years ago. But this performance has come at a price: vision controller housings have grown considerably. So where does software-based Machine Vision go in cramped production facilities?

A secure corporate cloud might seem the obvious solution. For various reasons, however, this is not yet practical in many companies. That does not take the issue off the table, because there is another way to combine security and full control with a vision controller outside the actual production area:

Edge Computing.

Edge computing means real-time data processing at the edge of one's own network, close to the data source. Practically speaking: the processing does not take place on the production line itself, but in a nearby control cabinet on the factory floor.

 

The Edge offers robust data transmission

This means that data does not leave the company – often an important security consideration. Equally essential for automated production: if processing took place in the cloud instead, latencies would rise and the data network would carry a heavy load. As engineers from University College Cork (Ireland) found, the Edge reduces latency noticeably compared to the cloud: depending on the test setup, the Edge outperformed the cloud by 67.7 to 99.4 %. At the same time, the Edge proved extremely robust: in none of the stress tests were the researchers able to interrupt edge communication. In the cloud, this succeeded – depending on the stress level – in 0.11 to 6.6 % of the 900,000 simulated requests. (Source)

Translated to real applications, this means: rural regions in particular still lack sufficient bandwidth, and every network node that has to be traversed costs valuable milliseconds. For cloud-based processing, this would translate into longer cycle times in many areas.

Machine Vision via Edge Computing circumvents this hurdle while still allowing decentralised processing. Another advantage that should not be underestimated is resilience against internet outages: since edge solutions in Machine Vision are usually connected by cable (wireless communication would also be conceivable with little loss of speed), such outages or fluctuations do not affect the system.

 

Fast and reliable production

The Edge also ensures that all image data remains within the company at all times. This is an aspect that should not be underestimated, especially for inspection parts that are classified. Although a very high level of security is now possible with cloud solutions as well, it often still requires costly measures for encrypting data and transmission. Many companies therefore continue to shy away from the cloud for good reason.

Edge computing is attracting growing interest for another reason as well: the need for real-time analysis is increasing across industry, and in Machine Vision in particular. Today's intelligent systems can, for example, complete demanding quality inspections of complex parts within seconds or even milliseconds. The system thus identifies bad parts at an early stage – and can sort them out directly, for example for subsequent reworking. Rejects and waste are drastically reduced. Unlike the cloud, the Edge easily keeps up with these short cycle times; an important advantage in terms of efficiency.
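The cycle-time argument above can be sketched in a few lines. This is a minimal illustration, not a real inspection system: the classifier, the defect-score field and the 50 ms budget are all assumptions chosen for the example.

```python
import time

CYCLE_BUDGET_MS = 50.0  # assumed per-part time budget on the line


def classify(part: dict) -> bool:
    """Stand-in for a complex quality inspection; True means a good part."""
    return part["defect_score"] < 0.2  # hypothetical defect threshold


def inspect(part: dict):
    start = time.perf_counter()
    good = classify(part)
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    # On the edge, the budget is easily met; over the cloud, network
    # round-trips alone could already exceed it.
    within_budget = elapsed_ms <= CYCLE_BUDGET_MS
    return good, within_budget


good, in_time = inspect({"defect_score": 0.35})
print(good, in_time)  # the defective part is flagged in time for sorting
```

The point is simply that the pass/fail decision stays inside the per-part budget when it is made locally.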

 

Is Edge Computing displacing the cloud?

Clearly, at this point in time, Edge Computing can offer Machine Vision more reliable benefits with less effort than the Cloud. But this does not disqualify the latter for the automated industry.

Hybrid solutions that combine the best of edge and cloud computing are already conceivable: for example, in AI-based quality inspection, the edge can take over the time-critical inspection of the workpieces passing through with the help of pre-trained neural networks. A connected cloud can then take over the permanent documentation of all parts – a potentially important function for safeguarding any warranty claims. In addition, special cloud services could use the day’s image data after the end of production to further optimize the existing neural networks.
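The division of labour described above can be sketched as follows. This is a simplified model under stated assumptions: the pre-trained neural network is replaced by a stand-in score threshold, and the "cloud" is just a local queue from which records would be uploaded off the critical path.

```python
import queue
import time


def inspect_on_edge(image_score: float, threshold: float = 0.5) -> str:
    """Time-critical pass/fail decision on the edge.

    `image_score` stands in for the output of a pre-trained neural
    network (e.g. a defect probability); the model itself is omitted.
    """
    return "reject" if image_score >= threshold else "pass"


# Non-critical documentation is queued and shipped to the cloud later,
# e.g. for warranty records or nightly retraining of the network.
cloud_log: "queue.Queue[dict]" = queue.Queue()


def process_part(part_id: int, image_score: float) -> str:
    verdict = inspect_on_edge(image_score)  # low latency, on the edge
    cloud_log.put({
        "part": part_id,          # durable record destined for the cloud,
        "score": image_score,     # uploaded outside the cycle-time-
        "verdict": verdict,       # critical path
        "ts": time.time(),
    })
    return verdict


results = [process_part(i, s) for i, s in enumerate([0.1, 0.8, 0.3])]
print(results)  # ['pass', 'reject', 'pass']
```

In a real deployment, a background worker would drain `cloud_log` to the cloud service, so upload latency never touches the inspection loop.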

If the image data is too large or covers too large a scene, the Edge can pre-process and crop it. This simple step saves costs in the medium term, because smaller image files also reduce storage requirements in the cloud. Cloud and Edge thus become a dynamic duo that passes the ball back and forth.
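A minimal sketch of such region-of-interest cropping, assuming the image is given as a plain list of pixel rows (a real system would use an image library, but the size arithmetic is the same):

```python
def crop_roi(image, top, left, height, width):
    """Crop a region of interest from an image given as a list of rows."""
    return [row[left:left + width] for row in image[top:top + height]]


# Stand-in for a Full HD camera frame (1080 rows of 1920 pixels).
full_frame = [[0] * 1920 for _ in range(1080)]

# Hypothetical ROI: only this patch is forwarded to the cloud.
roi = crop_roi(full_frame, top=400, left=800, height=200, width=300)

pixels_before = 1920 * 1080
pixels_after = len(roi) * len(roi[0])
print(f"reduction: {1 - pixels_after / pixels_before:.1%}")  # reduction: 97.1%
```

Even before compression, forwarding only the region of interest cuts the data volume by more than an order of magnitude, which is what drives the cloud storage savings.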

 

Wide range of applications for Edge Computing in Machine Vision

Thanks to these properties, the Edge is now used for Machine Vision in a wide variety of areas. Edge-connected camera systems are often found:

  • In security monitoring,
  • In the acquisition of various image-based data from IoT and IIoT,
  • In autonomous vehicles,
  • In documentation of industrially manufactured (partial) products,
  • In quality assurance of automated production.

In principle, all Machine Vision tasks can also be solved in the Edge – provided the processing is complex enough to require a powerful vision controller, or an integrated solution is uneconomical or unsuitable. Otherwise, embedded vision may be the solution of choice: here, the processing is integrated directly into the camera.

 

Edge computing in more and more vision systems

Increasing efficiency, higher requirements, limited space: there are many reasons to use an edge-based vision system, and the setup offers numerous advantages. This is one reason why more and more production facilities are opting for Machine Vision via the Edge. Space constraints in the plant, originally the decisive factor, are increasingly taking a back seat to efficient and secure automation.
