The Internet of Things and connected objects: the stakes ahead
By 2030, there will be 30 billion connected objects worldwide, including 244 million in France, according to estimates by ADEME (the French Agency for Ecological Transition) and Arcep (the country’s electronic communications authority). Some will be everyday objects, others will be used in professional applications.
The Internet of Things (IoT) is brimming with development potential and exciting possibilities for homes and businesses. As it expands, the IoT is also bringing innovative solutions for electrification. It will help industrial companies increase their productivity and help people reduce their energy consumption at home. And it is growing 15% to 20% a year across the board.
Let’s take a deep dive into the IoT concept, the prospects for harnessing it for electrification, and the issues and challenges surrounding it.
The Internet of Things: how does it work?
The IoT is a network of connected devices with built-in firmware, sensors and connectivity enabling them to interact with the Internet. Examples range from household appliances to electricity meters and cable drums.
The IoT, in a nutshell, makes things smart. They can collect data, process it onsite and share it online or with other devices for deeper analysis. They can then take action to improve operations or automate tasks.
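To make that flow concrete, here is a minimal sketch in Python of the collect, process-on-device and share pattern. The device name, threshold values and `publish` stub are hypothetical placeholders, not a real product or vendor API.

```python
import json
import random
import statistics
import time

def read_power_sensor() -> float:
    """Simulate reading instantaneous power draw (watts) from a sensor."""
    return random.uniform(200.0, 1200.0)

def publish(topic: str, payload: dict) -> None:
    """Stand-in for sending data to a cloud backend (e.g. over MQTT or HTTP)."""
    print(f"[publish] {topic}: {json.dumps(payload)}")

def run_device(device_id: str = "meter-001", samples: int = 10) -> None:
    readings = []
    for _ in range(samples):
        readings.append(read_power_sensor())   # 1. collect data locally
        time.sleep(0.01)

    summary = {                                # 2. process it on the device
        "device": device_id,
        "mean_w": round(statistics.mean(readings), 1),
        "peak_w": round(max(readings), 1),
    }
    publish("iot/energy/summary", summary)     # 3. share only the compact result

if __name__ == "__main__":
    run_device()
```

The point is that only a compact summary leaves the device, which keeps bandwidth and energy needs low.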
When you transfer data online, you can build an intelligent ecosystem where devices are used in smarter, more modular ways. You can upgrade a home into a smart home, a city into a smart city and a grid into a smart grid. In France, with 35 million smart meters, electrification is among the sectors that have reached the most advanced stages of digitalization with the IoT. And it will go even further: Enedis, which operates the country’s electricity distribution system, has announced plans to install 250,000 sensors throughout its grid over the next five years.
On the operations side, the possibilities are numerous. For example, home automation environments will be able to manage energy consumption. For connected objects to communicate effectively, however, they need specific systems: radio modules, sensors, cellular routers and gateways are all essential to manage data flows and tackle the related challenges.
IoT and innovative electrification solutions
The IoT is bringing in an array of electrification solutions that create value in homes and companies:
- Managing energy consumption: the IoT can help consumers keep an eye on their energy consumption and manage it more efficiently using real-time electricity and gas meter readings. Smart connected objects can also be programmed to switch off automatically when they are not being used, which further reduces energy costs (see the sketch after this list).
- Monitoring equipment: companies can use the IoT to monitor their solar panels, wind turbines and other systems remotely, to make sure they are running properly and optimize their output.
- Storing energy: the IoT can also help to monitor and manage storage levels, and optimize battery charging-discharging cycles.
- Reducing costs: the IoT can also help to reduce operation and servicing costs by enabling predictive maintenance, shortening downtime, and optimizing supply chains and use of resources.
- Optimizing grid operation: the IoT does this by tracking demand for energy in real time and adjusting supply accordingly, which can help to reduce power production costs and optimize distribution.
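As an illustration of the first bullet above (managing energy consumption), the sketch below shows one way an automatic switch-off rule could work, assuming a hypothetical standby threshold and idle timer; it is not taken from any specific product.

```python
from dataclasses import dataclass, field

@dataclass
class AutoOffController:
    """Switch a device off after it has idled below a standby threshold."""
    standby_watts: float = 15.0      # assumed standby threshold
    idle_limit: int = 5              # consecutive idle readings before switch-off
    _idle_count: int = field(default=0, init=False)
    is_on: bool = field(default=True, init=False)

    def update(self, power_w: float) -> bool:
        """Feed one real-time reading; returns True if the device stays on."""
        if not self.is_on:
            return False
        self._idle_count = self._idle_count + 1 if power_w < self.standby_watts else 0
        if self._idle_count >= self.idle_limit:
            self.is_on = False       # automatic switch-off saves standby energy
        return self.is_on

controller = AutoOffController()
for reading in [480, 35, 12, 9, 11, 8, 10]:  # watts, simulated meter feed
    print(reading, "->", "ON" if controller.update(reading) else "OFF")
```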
Issues and challenges around connected products
There are several practical and economic issues and challenges surrounding IoT operation.
IoT communication
When you have objects scattered around the globe, the first challenge is to interconnect them. Some of them may be in city centers, others may be in out-of-the-way places that telecom networks barely reach. To tackle this challenge and improve scalability, Nexans uses a variety of communication protocols and teams up with telecom operators worldwide.
Then you must integrate the routers, sensors and other devices mentioned earlier. Three main factors come into play in an IoT rollout:
- the range of the equipment and connected objects you use;
- energy consumption;
- bandwidth requirements and capacity.
In other words, you must adapt the available resources to match the complexity of the infrastructure—and that infrastructure can span a local area, a country or the globe. That is why it is important to partner up with other experts, as Nexans started doing with Orange in 2020.
Cybersecurity for the IoT
Cybersecurity is as central to the IoT as its efficiency. The more connected objects there are, the greater the risk of cyberattacks, because the objects collect sensitive data and can provide hackers with a back door into a company’s information system. The entry point can be a computer as much as a connected object.
Even something as simple as a camera can be a way into the core system. One casino, for example, was reportedly hacked through an Internet-connected fish-tank thermometer linked to the rest of its network. Ironclad security protocols are an absolute must for the IoT: a device can be a risk however harmless it may seem.
The IoT business model
Large-scale IoT rollout is viable even when you factor in all the complexity associated with integration. For instance, it provides several advantages in industrial production and supply chains:
- smoother goods flows and real-time monitoring and updates;
- more efficient collaboration between departments;
- better goods tracking and transit;
- swift and secure data collection;
- tighter control over stock.
Besides all of the above, customer service teams can respond faster, especially when they have to deal with delivery delays or other problems.
Ultracker: the Nexans solution to optimize supply chains
Here at Nexans, we have developed Ultracker, a pioneering digital solution that harnesses the data collected by IoT sensors, combined with artificial intelligence and cloud storage.
With this solution, our customers (installers and utilities) can:
- optimize their working capital and logistics flows;
- shrink their carbon footprint by shortening drum rotation cycles;
- reduce losses and prevent cable theft.
The IoT trackers embedded in our cable drums, transportation fleets and cable-related products enable customers to track drum status more closely, get a clearer picture of their stock levels and supervise jobsites remotely. This cuts raw-material and supply wastage.
Nexans’ IoT expertise and the solutions we have set up with our partners enable cable-system and cable life-cycle management, ranging from on-site delivery to measuring how much cable is left on a drum before pick-up. A leading European electricity distributor that adopted Ultracker to monitor its cables via the IoT is saving over €1 million a year.
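By way of illustration only, and not a description of how Ultracker actually works, a rough geometric estimate of the cable left on a drum can be derived from the drum dimensions and the number of wound layers and turns; all figures below are hypothetical.

```python
import math

def estimate_remaining_cable(barrel_diameter_m: float,
                             cable_diameter_m: float,
                             full_layers: int,
                             turns_per_layer: int) -> float:
    """Rough geometric estimate of cable length (m) still wound on a drum."""
    length = 0.0
    for layer in range(full_layers):
        # Winding diameter grows by one cable diameter on each side per layer.
        winding_diameter = barrel_diameter_m + cable_diameter_m * (2 * layer + 1)
        length += math.pi * winding_diameter * turns_per_layer
    return length

# Hypothetical drum: 0.8 m barrel, 30 mm cable, 6 full layers, 20 turns each.
print(f"{estimate_remaining_cable(0.8, 0.030, 6, 20):.0f} m remaining")
```

In practice, dedicated sensors and analytics would refine such an estimate considerably.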
Artificial Intelligence (AI) has been around for a while. The first models date back to the 1970s, but those concepts remained largely theoretical until we were actually able to teach computers to think for themselves. Today, Artificial Intelligence is everywhere. It allows computers and cloud-connected devices to reproduce human behaviours such as reasoning, planning and creativity. Artificial Intelligence is primarily dependent on the quantity of data it is given, which is where big data plays an active role. With the increased collection and analysis of digital data, big data and AI are now emerging as rich areas of opportunity for electrification professionals.
Electricity 4.0: Big data and AI for smarter power management
Big data is a major trend in the energy industry. Electrical networks become smart grids thanks to data collected from a variety of sources, such as smart meters, sensors and digital twins. Once stored, this data is an invaluable resource that helps the industry make better decisions about energy production and consumption.
Electricity was deployed extensively in the late 19th century, during the first wave of electrification, from 1880 to 1920. This period saw the widespread adoption of electrical power in industry and the development of the first electrical grids. The second wave of electrification took place between 1920 and 1950, with the expansion of the electrical grid into homes and the development of new electrical appliances such as refrigerators and washing machines. During the third wave of electrification, from 1980 to the present, we have seen the digital revolution and the development of new technologies such as computers, the Internet and mobile phones.
Today, the fourth wave of electrification, known as Electricity 4.0, is characterized by the integration of digital technologies such as Artificial Intelligence (AI), the Internet of Things (IoT) and advanced data analytics into the electricity infrastructure.
The aim of Electricity 4.0 is to create a smarter, more efficient and more sustainable electricity system that can respond to fast-changing demand (+20% by 2030, +40% by 2040).
Electricity 4.0 is expected to optimize the use of existing assets, integrate renewable energy sources into the grid, increase energy efficiency, reduce greenhouse gas emissions, improve grid stability, cut costs and provide customers with more reliable and flexible energy services.
Moreover, generative AI and adjacent models are changing the game: support technology is reaching a new level, application development time is shrinking, and powerful capabilities are being put in the hands of non-technical users.
Just recently, we saw the buzz around ChatGPT and what it can achieve. If we ask it “how do big data and AI impact electrification?”, we have to admit that ChatGPT’s answer, while not perfect, is still very impressive.
These technologies will definitely have an impact on the world of electrification. But AI depends above all on the quantity and quality of the data used for learning. Big data provides the storage and processing capabilities needed to train AI models by feeding them vast amounts of information.
Machine learning and AI are the winning combination for exploiting big data efficiently. This involves identifying patterns through data mining and, more generally, data science.
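As a minimal sketch of what “identifying patterns” can look like in practice (a generic example, not a Nexans pipeline), the snippet below flags unusual smart-meter readings with a rolling mean and standard deviation computed over the previous day.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Synthetic hourly consumption (kWh) with a daily cycle plus noise.
hours = np.arange(24 * 14)
consumption = 1.5 + 0.8 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 0.1, hours.size)
consumption[200] += 3.0   # inject one abnormal spike

window = 24
anomalies = []
for t in range(window, consumption.size):
    recent = consumption[t - window:t]
    z = (consumption[t] - recent.mean()) / (recent.std() + 1e-9)
    if abs(z) > 4.0:                   # assumed detection threshold
        anomalies.append(t)

print("Anomalous hours:", anomalies)   # should flag the injected spike near hour 200
```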
Big data: The cloud has won
In the age of big data, the much-publicized second wave of “move to cloud” announced by providers is underway and accelerating. As a reminder, the first phase of a cloud migration is a discovery phase, which analyzes the strengths and weaknesses of an infrastructure and determines future needs.
The number of detractors is shrinking every day: privacy and sovereignty concerns are being addressed by cloud providers’ strategic commitments and outweighed by ease of use. All sectors – banking, telecoms, insurance, etc. – are rapidly adopting cloud-hosted big data solutions.
The first paradigm shifts are appearing in the world of electrification, driven in particular by operators such as TotalEnergies and Schneider. We can also note the estimated predominance of Microsoft’s Azure services over Amazon’s AWS in public cloud services related to big data.
Exploring the challenges of generative AI and Big Data in 2023
Generative AI promises to make 2023 one of the most exciting years for AI and, by extension, Big data!
Keep in mind that ChatGPT’s prowess is based on web data collected up to 2021. As with any new technology, we must proceed with pragmatism and caution, because it still presents many challenges:
- Ethics: what sovereignty over data? What protection for personal data? What commitment to transparency and clarity from the players involved?
- Environment: AI and big data are a paradox in that they are both a means of optimizing energy consumption and resource use, and a driver of that very consumption;
- Cybersecurity: AI and big data in the energy field rely largely on measuring instruments, and therefore on the IoT, which presents an ever-growing attack surface;
- Business model: while the value of AI in the energy field no longer needs to be demonstrated, the business model associated with these services is highly complex. In the residential segment, for example, the ChatGPT virtual assistant created a buzz in the same week that Amazon announced massive layoffs, including in the Alexa division (Amazon’s own virtual assistant);
- Talent: developing digital services requires onboarding excellent technical skills, but not only that: the entire operating model needs to be rebuilt. The human dimension is one of the biggest challenges brought by AI and big data: attractiveness, meaning of work, working conditions, etc.
Big data analysis combined with artificial intelligence also involves various risks. Key concerns include unintended consequences of automated decision-making, increased exposure to cyber-attacks due to reliance on technology, inaccurate predictions leading to poor decisions, over-reliance on algorithms instead of human judgement, and a lack of transparency in the development process.
AI and big data for Nexans
As mentioned above, AI in the energy domain is most often delivered by a “phygital” system, meaning software plus hardware.
To this end, an important part of our work on AI and big data concerns the implementation of learning based on neural networks, whose role is to translate images or text from measuring instruments (thermometers, drones, etc.) into numbers. The aim of these approaches is to understand recurring patterns, date them, predict them and locate them. This is AI for grid sensing.
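As a rough, hypothetical illustration of this kind of image-to-number model, and not Nexans’ production architecture, here is a tiny convolutional network in PyTorch that maps a small grayscale crop of an instrument dial to one of ten digit classes.

```python
import torch
from torch import nn

class TinyGaugeReader(nn.Module):
    """Toy CNN: 28x28 grayscale instrument crop -> one of 10 digit classes."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 7 * 7, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

model = TinyGaugeReader()
fake_batch = torch.randn(4, 1, 28, 28)   # stand-in for camera crops of a dial
logits = model(fake_batch)
print(logits.argmax(dim=1))              # predicted digit per image (untrained)
```

A real grid-sensing model would be trained on labeled instrument images; this untrained toy only shows the shape of the approach.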
One of the important activities in the field of electrification is the monitoring of networks across all segments: generation, transmission, distribution and the use of electricity in buildings and industry. This requires the development and implementation of sensors that measure electrical activity along the value chain.
This is already the case in developed economies, at home and in industry, with smart meters. High-voltage transmission lines are also systematically monitored for temperature and voltage. Medium-voltage distribution networks and the connection networks of distributed renewables are monitored less frequently.
It is therefore essential to obtain data on the entire electricity deployment chain.
A second important activity is the analysis of data in order to optimize products or systems. This is at the heart of artificial intelligence and big data.
In technical terms, we draw on techniques developed primarily for natural language processing and computer vision, such as recurrent and convolutional neural networks: the same family of deep-learning building blocks that underpins tools like ChatGPT and DALL-E.
A long-term energy transition
Big Data is a hot topic with huge implications for the energy sector. It is a powerful tool that can be used to improve the efficiency of energy systems, production and consumption, as well as electrical networks and smart technology.
Thanks to big data, it is possible to explore various scenarios and objectives related to the energy transition. In particular, this technology makes it possible to analyze how different systems and supply sources are interconnected and how they could be optimized over the long term. It thus offers an invaluable perspective for achieving a degree of energy autonomy as part of a long-term transition objective.
The 3S (smart, small and selectivity) are challenges for the years to come. Addressed in a piecemeal way today, they will become the real challenges for AI applications tomorrow:
- Smart data: understanding and monitoring local ecosystems;
- Small data: limiting the use of energy-intensive big data (see the sketch after this list);
- Selectivity: optimizing only the resources that are needed.
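To illustrate the “small data” idea above (a hypothetical sketch, not a Nexans method), the snippet below reduces a day of one-minute meter readings to a handful of descriptive features before anything is sent to the cloud.

```python
import numpy as np

rng = np.random.default_rng(seed=7)

# Synthetic day of one-minute power readings (kW) with a slow daily cycle.
raw = 1.2 + 0.5 * np.sin(2 * np.pi * np.arange(1440) / 1440) + rng.normal(0, 0.05, 1440)

# 1440 raw samples reduced to a few features: far less data to move and store.
daily_features = {
    "mean_kw": round(float(raw.mean()), 3),
    "peak_kw": round(float(raw.max()), 3),
    "peak_minute": int(raw.argmax()),
    "load_factor": round(float(raw.mean() / raw.max()), 3),
}
print(daily_features)
```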
Using digital twins to reduce carbon emissions
The idea of the “digital twin” was described by David Gelernter in 1991, but the concept had already been applied by NASA in the 1960s during the Apollo program, and the space agency is often credited with coining the term Digital Twin.
The concept of a digital twin consists in recreating physical objects, processes or services in a virtual environment. Its use contributes to improving the design and functionality of systems, optimizing their maintenance and diagnosing potential problems. Digital twins have also become powerful decision-making tools for strategic planning.
What is the principle behind digital twins?
A digital twin (DT) is a virtual representation of a physical object or service. It can cover anything from the simplest to the most complex objects: components, mechanical parts, gears, buildings, cities or electrical networks as large as a country’s. It also includes the digitalization of industrial processes.
The digital twin generates simulations in order to observe potential future scenarios. The outcomes can change depending on a variety of factors, such as environmental conditions.
The digital twin helps to shorten the design phase and to reduce operating and maintenance costs. Digital twins are often combined with other digital technologies, such as the Internet of Things (IoT), artificial intelligence and cloud computing. Their main fields of application are in industries as diverse as healthcare, aerospace, energy and automotive.
Product DT
A virtual representation of a single component or a larger set of a physical object, such as a car engine or a road bridge.
Process DT
A digital view of an entire manufacturing process or of a logistics and supply chain flow.
System DT
A multidimensional image of a more complex system, such as a building or even a city.
Operating possibilities and design of a digital twin
Digital twins are tools available to a wide range of users and many different functions can benefit from their implementation:
- Designers and engineers can build network architectures that are optimized for efficiency and cost, or simulate the resistance of a machine under severe environmental conditions (for example, the behavior of an aircraft turbine vibrating at a high frequency).
- Supply chain and production managers can monitor systems such as an electrical network or logistics flows, and anticipate malfunctions or failures (for example, the impact across the entire supply chain of a major disruption in logistics flows or a raw-material shortage).
- Investment planners and managers can assess the impact of alternative trade-off scenarios between maintenance opex and investment capex.
Digital twins rely on three major building blocks (a minimal sketch follows the list):
- Collection and organization of real-world data to create the virtual replica. This step is not only based on data; it also incorporates physical equations and, where available, models of the interactions between the components of the system;
- Processing of the data by the user through an interface, to configure the twin, visualize simulation results and thus “interact” with it;
- Analytics and computing power, made possible by cloud technologies capable of processing massive amounts of data and modeling very complex multidimensional systems and their interactions.
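A minimal, hypothetical sketch of how these building blocks fit together (not the Nexans/Cosmo Tech solution): a toy twin of one grid feeder that mirrors measured load and answers a simple what-if question on top of it.

```python
from dataclasses import dataclass

@dataclass
class FeederTwin:
    """Toy digital twin of one grid feeder: mirrors real data, runs what-ifs."""
    capacity_kw: float
    measured_load_kw: float = 0.0

    def sync(self, sensor_load_kw: float) -> None:
        """Block 1: ingest real-world data to keep the replica up to date."""
        self.measured_load_kw = sensor_load_kw

    def simulate_added_load(self, extra_kw: float) -> dict:
        """Blocks 2 and 3: let a user ask 'what if?' and compute the outcome."""
        projected = self.measured_load_kw + extra_kw
        return {
            "projected_kw": projected,
            "utilisation": round(projected / self.capacity_kw, 2),
            "overload": projected > self.capacity_kw,
        }

twin = FeederTwin(capacity_kw=400.0)
twin.sync(sensor_load_kw=310.0)                    # data from field sensors
print(twin.simulate_added_load(extra_kw=120.0))    # e.g. a new EV charging hub
```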
As is true with any major digital transformation program, implementing cybersecurity risk-assessment routines, mitigation procedures and a dedicated organization is an important prerequisite before launching a digital twin program.
By 2027, companies and other players in this market are expected to spend up to $73 billion on digital twins, and the market is expected to grow by 30% to 45%. In addition, digital twins could increase speed to market by 50% and improve the quality of products offered by 25%.
What are the main advantages of digital twins?
Companies that embrace a dedicated digital twin design strategy can unleash substantial value:
- Ability to make more informed decisions in complex environments, through a better understanding of how multiple possibilities affect multiple indicators. A digital twin is a decision-support tool capable of running thousands or hundreds of thousands of scenarios and analyzing their consequences and bottlenecks
- Strengthening risk management by testing mitigation plans to respond to extreme scenarios (for example, simulating the spread of a fire in a building to identify optimal escape routes)
- Ability to react in near-real time to the status of critical equipment (for example by balancing the load of the power grid to reduce or eliminate local congestion)
- Shorter new product development cycles through virtual testing of alternative, designed-to-cost prototypes
- Reduced operating costs through an optimization of the productivity and efficiency of manufacturing lines.
- Improved product quality through real-time sensor monitoring and better control of production process parameters
- Creation of customized services and new offerings and business models: moving from periodic maintenance to predictive maintenance
- Knowledge management and sharing, such as the codification of informal best practices implemented on the shop floor into standard operating procedures
How does Nexans use digital twins for decarbonized electrification?
The world around us, our lives and our mobility will need to be more electric in the future, as electrification is one of the immediately actionable levers to fight and limit the impacts of climate change. The energy grids must be reliable, however, as an electric future cannot afford blackouts. The more dependent on the grid we become, the more resilient these systems must be.
In partnership with Cosmo Tech and Microsoft, Nexans is developing a digital twin solution dedicated to electrical networks. Grid operators will benefit from powerful software that allows them to reduce their carbon footprint by adopting new investment and maintenance policies, while preserving their profitability by maximizing the value of their infrastructure.
Leveraging real-world data, Nexans also acts on the installation, operation and maintenance of electrical networks. Using the data from sensors installed on the network, Nexans provides a near real-time view of the areas of congestion on the network and can also detect and localize imminent failures before they occur.
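By way of illustration only, and not the actual Nexans algorithm, congestion flagging from network sensors can be as simple as comparing each measured load against the local rating and ranking the worst sections; the section names and figures below are invented.

```python
# Hypothetical near-real-time readings: section id -> (load in amps, rated amps)
readings = {
    "feeder-A/section-3": (410.0, 400.0),
    "feeder-A/section-7": (295.0, 400.0),
    "feeder-B/section-1": (388.0, 360.0),
}

# Keep sections loaded above 95% of their rating, worst first.
congested = sorted(
    ((sid, load / rating) for sid, (load, rating) in readings.items() if load > 0.95 * rating),
    key=lambda item: item[1],
    reverse=True,
)

for section_id, utilisation in congested:
    print(f"{section_id}: {utilisation:.0%} of rating -> investigate or reroute")
```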
To achieve its objectives across the financial, environmental and social dimensions, Nexans has built its own digital twin solution, E³, a business performance tool that is as powerful as it is unique. This tool measures and monitors performance on the basis of three KPIs: Return on Invested Capital, Environmental Return on Carbon Employed, and Return on Skills Employed.
What is the main takeaway of digital twins?
The digital twin brings augmented intelligence to human skills. Its design and deployment use modeling tools, data analytics and a great deal of computing power to predict the outcomes of different scenarios across the business. One of the great advantages of digital twins is that they can take into account future data and future interactions or equations between components that do not necessarily exist in past data.
Sitting at the crossroads of artificial intelligence, data analytics and the Internet of Things, digital twins open untapped pools of productivity and performance to a variety of users. From engineering and supply chain managers to C-level decision makers, digital twins move the future forward.
Author
Olivier Pinto is Nexans’ Innovation Director in charge of services and digital solutions for power grids. He leads a team of grid experts developing a portfolio of innovative offerings designed to solve the issues and address the challenges faced by electrical network operators, leveraging a solid ecosystem of technology partners. Olivier joined Nexans in 2001 and has held various R&D, operational and sales & marketing positions. He holds an M.Sc. from the School of Chemistry, Physics & Electronics of Lyon, France.