Broadcom Buys Symantec; Photonics and Quantum Computing; Autonomous Driving Test Software

BRIAN SANTO: I’m Brian Santo, EE Times Editor in Chief, and you're listening to EE Times on Air. This is your Briefing for the week ending August 16th.

Photonics – it’s not just for fiber optics anymore. In this episode, we’ve got a discussion about photonics, quantum sensors, and the potential for an all-optical computer.

Broadcom bought Symantec last week. We ask editor Rick Merritt, Why on Earth a chip company would want to get into the market for business software?

Over the years, the EDA industry has developed some marvelously sophisticated tools for testing and verifying the designs of highly complex integrated circuits. This week we have an interview with the CEO of a startup – a company that has its roots in EDA – about the tools it has developed to improve the testing process for autonomous vehicles. The tools will help AV companies determine if they’re testing what they think they’re testing.

ZIV BINYAMINI: The autonomous vehicle industry is reporting the number of miles driven, the number of miles driven physically on the streets, and the number of miles driven virtually in simulation. In fact, when you look at the number of miles – how do you know when we’re done? Is it after ten million miles? Or eleven? Or maybe 100 million miles? Or a billion miles? It’s just not a good metric to know when you’re done.

BRIAN SANTO: We’ll get back to autonomous vehicle design in a moment. First up today…

Companies seem to make a big deal about focusing on their core businesses. And then last week chip vendor Broadcom bought Symantec, which produces business software. I’m not seeing a core here. In fact, we might be outside the orchard on this one. Rick Merritt covered the acquisition for EE Times. International editor Junko Yoshida caught up with him to find out what’s going on here.


JUNKO YOSHIDA: So what’s behind Broadcom’s decision to buy Symantec?

RICK MERRITT: Like any company's chief executive, Hock Tan just wants to grow his company, Broadcom, into the biggest company he possibly can. What's different in this case is his philosophy of how to do that.

JUNKO YOSHIDA: Why is it a good idea for Broadcom to have Symantec under its roof?

RICK MERRITT: Tan makes what's a pretty compelling point that now that he can no longer grow his chip company into a bigger chip company after his attempt to acquire Qualcomm and NXP was nixed, he wants to grow a different kind of company: a chip company that also has a software company in it. And he makes a good point that there's always going to be a whole lot more end users that need to buy software licenses for key infrastructure like security than there are going to be OEMs who want to buy chips. So that's a potentially much larger total available market than any of his competitors will have.

JUNKO YOSHIDA: As you noted in your own story, "Intel Acquired McAfee But Sold It," Intel also bought Wind River but then sold it earlier this year. The combination of a chip giant and software companies didn't work out. But is that because Intel was being Intel? Is Broadcom another story? Does Broadcom have a strategy to make this work? If so, how?

RICK MERRITT: After a time, it became clear that the kind of synergies that Intel was seeking just weren't there, and Intel spun it out again. This time around, Hock Tan wouldn't come out and say as much in the conference call that he gave, but the implication he gave was that he's not really looking for synergies with software helping enable new kinds of chips. He's just trying to create a separate software business to a separate end user clientele, and that's going to make his business grow he thinks by $2 billion of revenue as soon as the deal closes. So I think Wall Street's buying it.

BRIAN SANTO: Rick’s story on Broadcom’s acquisition is on the web site.

This week, European correspondent Nitin Dahad filed a story that covered two R&D developments, both involving photonics. The research is all very preliminary, but the first project points toward what they're calling "quantum sensors." The second involves the construction of nanoscale photonic diodes, which in turn suggests the possibility of building processors that use light instead of electrons.

It’s all rather esoteric. So Junko asked Nitin to begin at the beginning.

JUNKO YOSHIDA: I'd like to cover the fundamentals first. I've always thought that the main application for optical circuits is in the area of fiber optic communications. But beyond that, I'm clueless. Nitin, please explain why we should care about photonic ICs.

NITIN DAHAD: I must admit I'm not an optical circuits expert either, but I was chairman of The Center for Integrated Photonics in the UK a few years ago, when we sold it to Huawei. So I have a little bit of knowledge.

Coming back to answering your question, Why should we care?, the simplest way to look at it is this: Photonics technology uses light, or photons, instead of electrons to carry information. When you think about it, photons traveling at the speed of light within optical circuits potentially means much faster data transmission and much less energy consumed.

JUNKO YOSHIDA: Wow.

NITIN DAHAD: So think about the implications for that, especially when we, in our industry, are constantly feeling challenged by the boundaries of ever-shrinking process technologies, Moore's Law, the data bandwidth limitations.

JUNKO YOSHIDA: Yup.

NITIN DAHAD: So for all the brain-like computers and neuromorphic computing, an all-optical chip could be the answer to mimicking neurons and attaining the interconnectivity and efficiency of the brain.

JUNKO YOSHIDA: In the story you cited two recent optical advances made independently in the labs of two institutions. One was the Technical University of Munich, and the other was Stanford University, I think. What are the most notable things discovered by each research team?

NITIN DAHAD: So I think the advances of both are fascinating and significant. And by the way, they are by no means the only work going on in this area. There are probably many research institutes working on it. It's just that these two came to my attention because they made a noise about it. And that's the point.

So the point of the photonics developments, both of them, is all around how you insert and manipulate light sources. Coming to the Technical University of Munich, they led a group of scientists from Germany, the US and Japan to actually put light sources into nanoscale semiconductor materials with great accuracy-- just a few nanometers-- using a specialist helium ion microscope to irradiate the material with precision. Now this is an experimental gateway to integrating quantum light sources into photonic circuits, so that will enable things like quantum sensors to be built into smartphones. Or, if we talk about IoT security, secure encryption for data transmission.

And then at Stanford University, researchers actually designed a nanoscale photonic diode and created all the necessary nanostructures and light sources to help bring it to life. At this stage I think they've just modeled it and done the calculations. What I've understood is that they've figured out how to manipulate light in both directions in a light-based diode. So they create the light rotation in a crystal using another light beam rather than a magnetic field. What that means is that the diode doesn't have to be so large, so it can be integrated into small components.

JUNKO YOSHIDA: Gotcha.

NITIN DAHAD: One of the grand visions stated by the researchers is to have an all-optical computer where electricity is replaced completely by light and photons, which drives all the information processing, which is what you said at the beginning. The increased speed and bandwidth of light would then enable faster solutions to some of the hardest scientific, mathematical and economic problems.

JUNKO YOSHIDA: But how far are we from the future of this all-optical computer you just talked about? You know, supposing electricity is replaced by light, and photons drive all information processing. When are we going to have that? And what are the challenges?

NITIN DAHAD: Well, this is really hard to tell. I haven't seen anything from the two teams to indicate time scales. When you think about it, I guess one of the challenges is how to fabricate all-optical gates or all-optical logic components. And to give you an idea, while researching this, I saw a paper from Aalto University in Finland from last year describing the development of nanowire networks that can perform binary logic functions such as AND, OR, NAND and NOR. So I guess we're probably still-- let's say at least five years away from doing something like that. But who knows? They say "necessity is the mother of invention." And if an IBM or an Intel or even one of the hyperscaler companies we talked about recently put enough money behind it, it may just be sooner than we think.

JUNKO YOSHIDA: Let's hope so. Thanks, Nitin.

NITIN DAHAD: Thank you, Junko.

BRIAN SANTO: That was Junko Yoshida with Nitin Dahad, who called in from London. You can read Nitin’s story on eetimes.com. It’s called “Optical Advances Pave Way for Quantum Sensors and Computing.”

Autonomous vehicle companies have been testing driverless vehicles on the road – of course. But they are also putting their autonomous driving systems through driving simulations. Doing both – real-world testing and simulations – is a tried-and-true method for product safety.

The problem with simulations for self-driving vehicles, however, starts with the fact that there are no standard tests for driving scenarios used in simulations. There aren’t even any standard metrics with which you could make standard tests. Furthermore, every AV company considers its test data to be proprietary. AV companies boast that they’ve driven millions of simulated miles, which is good, but nobody has any idea if any of those simulations are of any use, which is bad. Really bad. Because it creates a false sense of safety.

Last week, International Editor Junko Yoshida got an exclusive interview with Foretellix, a startup company in Israel that provides a tool to measure test coverage for AV developers. Foretellix asks the question: Are simulations covering the functions they’re supposed to cover? The company’s tools are used to answer the question. Here's Junko with the CEO of Foretellix, Ziv Binyamini.

JUNKO YOSHIDA: As autonomous vehicle companies are racking up more miles on public roads, in our recent chat you made a point that this shouldn't be about quantity of miles, but it should be about quality of coverage. Let's start from there. What did you mean by quality of coverage?

ZIV BINYAMINI: So we have to ask first of all, the autonomous vehicle industry is reporting the number of miles driven. Number of miles driven physically on the streets, the number of miles driven virtually in simulation. And they've already done millions of miles and maybe billions of miles virtually. But you have to ask yourself, What have they actually done? What have they actually simulated? They look at simulation, they drive in the rain, they drive with pedestrians and the combination of these. In fact, when you look at the number of miles, do you know when we are done? Is it after ten million miles? Or eleven? Or maybe a hundred million miles? Or a billion miles? It's just not a good metric to know when you are done.

So when we're talking about autonomous vehicles, we are also talking about something that is not controllable. We cannot tell the autonomous vehicle what to do. So we can't just say we're going to test this and this is what's going to happen. It may do whatever it chooses to do, because it's autonomous. So the coverage approach allows you to actually define metrics, all of the scenarios and parameters, the millions of them, and measure them independently of whatever the platform is and whatever the test intent is to actually see what actually happens, and integrate it into a common result.
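To make the coverage idea concrete, here is a minimal sketch in Python. The parameters, scenario names and functions are invented for illustration; this is not Foretellix's tool, just the shape of the metric Binyamini is describing: enumerate the scenario space up front, record what each run actually exercised, and report coverage rather than miles.

```python
# Minimal sketch (hypothetical parameters, not any vendor's actual tool) of
# scenario coverage: define the scenario/parameter space up front, record what
# each simulated or on-road run actually exercised, and report coverage.
from itertools import product

# The coverage model: every combination of these parameters is a coverage point.
WEATHER  = ["clear", "rain", "fog"]
ACTOR    = ["pedestrian", "cyclist", "oncoming_car"]
MANEUVER = ["unprotected_left", "lane_change", "emergency_brake"]

coverage_points = set(product(WEATHER, ACTOR, MANEUVER))
hit = set()

def record_run(weather: str, actor: str, maneuver: str) -> None:
    """Log what one run (simulated or on-road) actually exercised."""
    hit.add((weather, actor, maneuver))

# Three example runs -- two of them repeat the same coverage point, which is
# exactly why raw mileage is a poor progress metric.
record_run("clear", "pedestrian", "lane_change")
record_run("clear", "pedestrian", "lane_change")
record_run("rain",  "cyclist",    "unprotected_left")

covered = len(hit & coverage_points)
print(f"coverage: {covered}/{len(coverage_points)} "
      f"({100 * covered / len(coverage_points):.1f}%)")
# Unlike "miles driven", the gap tells you which scenarios remain untested.
print("examples still missing:", sorted(coverage_points - hit)[:3])
```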

JUNKO YOSHIDA: My second question is that safety is obviously the foremost concern for designers of autonomous vehicles or advanced driver assistance systems, ADAS. But how they actually measure safety is a big question, right? So tell us what you're proposing here.

ZIV BINYAMINI: So first of all, we are building on top of existing work. There are standards being created, like SOTIF and UL 4600, that are being defined. But what is missing, and what I think is our key contribution to the topic of safety, is measurable safety. So you want to be able to do all of the augmentation, all of the preparations, but we need a way to know what has been tested. What scenarios have been exercised? Have you exercised all of the possibilities? Have you been able to find all of the unknown risks? And our coverage measurement approach enables us to actually provide these metrics. So it's a measurable, quantifiable approach that complements the other methods.

The other point I want to make is that this method is not an external effort bolted on to autonomous vehicle development. It is part of the development process. You use the same process to find the bugs, to define the next steps, and then to reach your definition of "done." So it serves both safety and the actual process of making the vehicle safe.

JUNKO YOSHIDA: Right. Okay, so that I think you described during our interview. Something called "measurable scenario description language." I think you called it M-SDL. Is that correct?

ZIV BINYAMINI: Correct.

JUNKO YOSHIDA: That's what your company has developed. Now, my understanding is that you have grown up in the EDA world with your eyes focused on verification. So I want you to explain connections or commonalities in principles between measurable scenario description language and things like System Verilog, a standard language used for verification in the chip design world.

ZIV BINYAMINI: Right. So yeah, the founders of Foretellix have their origins in the chip industry. We actually come from companies like National Semiconductor and Intel. And then we moved into EDA to define the solutions over there, and provide them to the whole industry. At a high level, the similarity lies in two aspects. One is the very high level of complexity at scale. These systems -- you know, a system on a chip or a microprocessor -- are incredibly complex, and they need to be manufactured at scale. It's not a one-off. You have to create millions and millions of these devices.

JUNKO YOSHIDA: Right.

ZIV BINYAMINI: The other thing is the cost of failure. The cost of failure in the chip industry is high because once you commit to silicon, you cannot fix any bug. You have to throw away the silicon and redo the whole thing with costly re-spins. The costs are in the tens of millions of dollars. When you look at the autonomous vehicle industry, the complexity is even higher, and the cost of failure is very obvious, right? It's the lives of people, and businesses that will not be safe, will not be able to... So that's why we thought that the approach we developed in the chip industry, on the EDA side, can be applied here.

Now the concepts are very similar, and a lot of those concepts are already being brought over. One is the need for large-scale simulation. In the chip industry, people run millions of simulations. In the autonomous vehicle industry there was actually some debate on this a few years ago. I think now everybody realizes that it has to be large-scale simulation, in addition to the other platforms I mentioned.

Two is the need for a high-level verification language. In the case of the chip industry, there are several of these, like the e language that we created, System Verilog, and the Portable Stimulus spec that was just released recently. These are all high-level verification languages. We are similarly coming up with a domain-specific language, the measurable scenario description language, for autonomous vehicles and ADAS.
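As an illustration of what a high-level scenario description captures, here is a toy sketch in plain Python. It is not M-SDL syntax, and the class and field names are invented. The point is simply that the scenario declares the actors, the maneuver, the parameter ranges left open, and the expected outcome, leaving concrete values to the verification tool.

```python
# Toy illustration only -- NOT M-SDL syntax. It shows the kind of information a
# high-level scenario description captures: who is involved, what the maneuver
# is, and which parameters are left as ranges for a tool to fill in later.
from dataclasses import dataclass, field

@dataclass
class Range:
    low: float
    high: float

@dataclass
class CutInScenario:
    """Another vehicle cuts in front of the ego vehicle."""
    ego_speed_kph: Range = field(default_factory=lambda: Range(30, 120))
    cut_in_gap_m: Range = field(default_factory=lambda: Range(5, 40))
    weather: tuple = ("clear", "rain", "fog")
    # The expected outcome is part of the scenario, so every platform
    # (simulation, test track, street) can check it the same way.
    expected: str = "ego decelerates without hard braking or collision"

print(CutInScenario())
```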

Other key concepts are the need for constrained random: the ability to generate many, many different tests randomly, within constraints, to look for unknown and unexpected bugs and unconsidered combinations. And the other is the concept of coverage: the ability to define coverage and then measure objectively what you've actually tested. Because in the chip industry, too, we are talking about millions of coverage points. And while a chip is not autonomous, it is extremely hard to control; it is extremely hard to create the conditions that you want deep within a microprocessor.
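A minimal sketch of the constrained-random idea, again with invented parameters and constraints rather than any vendor's actual implementation: scenario parameters are drawn at random, but draws that violate the declared constraints are rejected, so tests stay legal while still hitting combinations nobody thought to write by hand.

```python
# Minimal constrained-random sketch (hypothetical constraints): draw scenario
# parameters at random, but re-draw anything that violates the declared
# constraints, so generated tests are legal yet still surprising.
import random

def draw_cut_in_scenario(rng: random.Random) -> dict:
    while True:
        s = {
            "ego_speed_kph": rng.uniform(30, 120),
            "cut_in_gap_m":  rng.uniform(5, 40),
            "weather":       rng.choice(["clear", "rain", "fog"]),
        }
        # Constraint: in this toy model, a gap under 10 m at highway speed is
        # not a legal cut-in, so reject the draw and try again.
        if s["ego_speed_kph"] > 90 and s["cut_in_gap_m"] < 10:
            continue
        return s

rng = random.Random(42)  # fixed seed so a failing run can be reproduced
for _ in range(3):
    print(draw_cut_in_scenario(rng))
```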

And the last concept that is very similar is the need for this language to be portable across multiple testing platforms -- street driving, simulation, test tracks and so on.

JUNKO YOSHIDA: Got it. You mentioned several high-level languages actually being used in the chip world. But is Foretellix the only company developing such a language for designing autonomous vehicles? Will there be more than one language?

ZIV BINYAMINI: I would hope that we are the most advanced, because of our vast experience. But there are several other attempts to create scenario description languages, and there is actually a standardization process going on within an industry organization that is trying to come up with a standard language. And the industry's hope is that a single standard will emerge that everybody will be able to use, so that there is a standard scenario description language and people can exchange and share these scenario specifications.

JUNKO YOSHIDA: But you said that your high-level language will be made available to the public on GitHub. When?

ZIV BINYAMINI: In a few months. We are actually working right now to collect feedback from some of our partners, and we are going to release it to a few more partners soon. Then after the summer, once we get all of the feedback, we'll consolidate it and integrate it into our current specification, and we are going to open the language, M-SDL, and put it on GitHub, as you said.

JUNKO YOSHIDA: All right, very good. Thank you so much for your time.

ZIV BINYAMINI: Thank you.

BRIAN SANTO: Check out Junko’s story on Foretellix on the web site. It’s called “EDA and AVs Find a Common Language.”

And now, Sherman, if you’ll just step into the Wayback machine, we’ve got it set for…

…August 12, 1981, when IBM introduced the first IBM PC, the model 5150. It was built around a blazing-fast 4.77 MHz Intel 8088, and ran Microsoft’s MS-DOS. Hey, Max, do you remember… floppy disks?

Also August 12th, this time in 1908, the first Model T came off the production line at Ford Motor Company. Assembly line production helped Ford cut costs and sell cars for less than anyone else. Twelve years later, roughly half the cars on the road in the US were Model Ts.

On August 11, in 1888, Edison’s phonograph was demonstrated in London for the first time. Now it had been demonstrated before at home in the US, but the London debut is notable because one of the very first musical recordings ever made… that we still have a copy of… was played during the event. It was Arthur Sullivan’s composition, “The Lost Chord.”

(AUDIO: “The Lost Chord”)

What you're listening to is that very recording.

(AUDIO: “The Lost Chord”)

And that’s your Weekly Briefing for the week ending August 16th.

This podcast is produced by AspenCore Studio. It was engineered by Taylor Marvin and Greg McRae at Coupe Studios. The segment producer was Kaitie Huss.

The transcript of this podcast can be found on EETimes.com, complete with links to the articles we refer to. We’ll have a new episode on the 23rd. Look for it on our web site, or you can find it on Spotify, on iTunes and on Blubrry, and anywhere else fine podcasts are found. This is EE Times on Air. I’m Brian Santo.

Thanks for listening to this episode. EE Times on Air is now also available on Ximalaya and Qingting FM. Subscribe and listen!