Susan Etlinger, an industry analyst at Altimeter Group, was one of the keynote speakers at Smart City Expo World Congress 2016. Her topic was “Artificial Intelligence in the Digital City” (see figure). I sat down with Susan to talk about the impact and challenges that Big Data and artificial intelligence (AI) will have on cities and society in general.
How is Big Data being handled?
Right now, it is a bit different from country to country. For example, in Germany, it is highly regulated. In the U.S., it is less regulated. Then, there is the General Data Protection Regulation [GDPR, a European Union initiative], so the whole world is dealing with this in different ways.
I think the most important thing for organizations is to think about what questions they want to answer and what services they want to provide, and then use data related to that (as opposed to collecting data and then later figuring out what to do with it). I think part of the problem now is that there is so much data but not that much information. Just because a sensor collects data does not necessarily mean that there is a good use for the data. There might be things that are interesting but not valuable.
I think one of the things we need to think about is the context in which we collect data, in addition to the potential uses to which we want to put the data. Then we should also do what is called “scenario planning” to try to determine the best thing that could happen and the worst thing that could happen.
How can artificial intelligence be used in smart cities?
The massive amount of data available now makes AI different than it was before. In addition, the algorithms are getting better and computers are able to handle Big Data more quickly. AI becomes interesting for smart cities when AI developers create systems that can learn from past experiences. For example, in a power grid where energy spikes tend to happen, AI can learn where they usually occur and under which circumstances. You can then make better use of your power grid. Other examples could be systems that, by learning, can provide services to disabled or elderly people who might not be able to go grocery shopping themselves.
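To make the power-grid example concrete, here is a minimal sketch (not a system Etlinger described) of how such a learning component might look: a classifier trained on historical grid readings to estimate when and where demand spikes are likely. The file name, column names, and spike label are all illustrative assumptions.

```python
# Minimal sketch of learning where and when energy spikes tend to occur.
# All column names and the CSV file are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical historical readings: one row per substation per hour,
# labeled with whether a demand spike occurred in that hour.
readings = pd.read_csv("grid_history.csv")  # placeholder file name

features = readings[["substation_id", "hour_of_day", "day_of_week",
                     "outdoor_temp_c", "prior_hour_load_mw"]]
labels = readings["spike_occurred"]  # 1 if demand exceeded a set threshold

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0)

# Learn the circumstances (time, weather, recent load) under which
# spikes have historically appeared at each substation.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Estimate spike probability for upcoming hours so operators can shift
# load or bring reserves online before the problem materializes.
upcoming = X_test.head(5)
print(model.predict_proba(upcoming)[:, 1])
```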
Susan Etlinger gives her keynote at Smart City Expo World Congress 2016.
I think the uses for AI are almost infinite. It is just a question of what are the right things to do, and being conscious of commercial potential and the potential downsides. The advantage is the ability to fix problems as they are beginning to happen instead of long after they have happened.
How will AI systems be deployed?
It is very complicated. Part of the challenge we have now is that a lot of AI’s resources and power have been gathered by very few companies, so a sort of AI monopoly has started to form. Google, Microsoft, and Facebook have all started AI ecosystems. In the future, we need to think about AI as a service and a utility itself. For now, at least, it seems that it might be good to start with the idea of a service where access is paid for by volume of data, or per hour or per project, so that you can apply that technology to a particular use.
Right now, not enough people know how to code and build intelligent systems at that level; the universities aren’t turning them out fast enough. That will probably change over time, and we’ll have a better workforce and better artificial-intelligence systems.
Will everybody get access to artificial intelligence?
I think in the future, AI is going to be a utility. It is going to be as normal as cloud computing today. It has taken 10 years for cloud computing to become something that companies accept, and some companies still don’t accept it. But I think it is going to take at least 10 years before we really have broad access to AI. Today, if you look at AI, it is Facebook, it is Google… basically, it is a handful of companies and a whole bunch of startups. We are at a real crossroads with technology. Part of the challenge with technology that thinks is that it can learn the bad things as well, so we have to make sure we are careful with data. We don’t want to accelerate things that could be disruptive to society and to businesses.