Intel talks CPU exploits, virtual reality and autonomous cars in its CES 2018 keynote address
The threats posed by Spectre and Meltdown didn't stop Intel from showing off its latest innovations.
In advance of CES 2018 kicking off tomorrow, Intel CEO Brian Krzanich today delivered his company's official keynote address to media in Las Vegas, outlining the latest initiatives the technology giant plans on rolling out this year.
However, before Krzanich unveiled the data-driven projects Intel has been working on, the CEO took a moment to confront the elephants in the room: Spectre and Meltdown, the two recently discovered and incredibly potent security exploits that have rocked the tech industry over the last few weeks.
Along with thanking the many other companies working alongside Intel to patch the exploits, Krzanich promised that updates for affected Intel processors are on the way, with the aim of issuing updates for over 90% of those processors within the next week and the rest by the end of the month.
With the rough stuff out of the way, Krzanich moved on to Intel's plans for 2018. First up: autonomous cars. Having purchased the Israeli technology company Mobileye last year, Intel aims to roll out Mobileye's Road Experience Management (REM) system to 2 million vehicles from BMW, Nissan and Volkswagen throughout 2018. REM will use cameras installed on these vehicles to produce high-definition road maps far more quickly and at a much lower cost than other mapping companies – or so Intel claims.
Those maps will likely prove useful for Intel's foray into developing its own automated driving platform. Powered by Mobileye's EyeQ5 chips and custom-built Intel Atom processors, this new initiative remains under wraps for the time being, with Intel set to reveal more details in a press conference tomorrow.
One more bit of driving news, this time for race fans: Intel is partnering with Ferrari to bring the power of artificial intelligence to motorsports. When the Ferrari Challenge North America Series kicks off this year, broadcasts will make use of a variety of Intel AI technologies to identify when important events occur during a race, taking the onus off human broadcasters to keep up with every car in the race at once.
AI will also analyse drivers' performance from the air using drone cameras, relaying that information both to the drivers and to the broadcasters. Machine learning algorithms will use the data to provide real-time feedback on driver performance – something that is currently only possible after a race has finished.
Intel's last announcement marks its first big step into the entertainment industry, with Krzanich revealing Intel Studios, a high-tech video-production studio dedicated to creating large-scale volumetric content. Volumetric capture is a video technique that scans real-world objects and environments into the digital domain using a multi-camera array, producing a three-dimensional representation that is especially useful for creating virtual reality experiences.
This massive new studio has already garnered some serious attention from big names in the entertainment industry, with Paramount Pictures signing on to explore its potential in Hollywood.
That same volumetric technology will also be used in Intel's other big entertainment project: VR coverage of the PyeongChang 2018 Winter Olympic Games. When the Games kick off later this year, Intel will live stream 30 Olympic events in virtual reality, with each event supporting between three and six different camera locations for viewers to watch from. Each viewpoint will also feature a unique audio mix of the natural background sounds captured by the cameras at that location, adding to the sense of actually being there.
Complementing the event coverage, Intel will also offer a virtual reality tour of the various Olympic venues, with the ability to fly around at your leisure and soak up the atmosphere from the comfort of home.