EETimes On Air is a 15- to 30-minute electronics industry news program produced by ASPENCORE for electronics professionals, recorded from interviews with EETimes reporters, editors, analysts and industry experts around the world. 《电子工程专辑》 (EE Times China) presents this news in its original English, so that readers can catch up on the latest developments in electronic engineering while also improving their English.
HEADLINE: Google I/O, AI Fairness and Women in Tech
JUNKO YOSHIDA: This is Junko Yoshida, EETimes Chief International Correspondent. You're listening to EETimes on Air.
DAVID FINCH: This is your EE Times Weekly Briefing. Today is Friday, May 10th, and among the top stories this week:
*Google I/O, Google's developers’ conference. CEO Sundar Pichai touted Google’s awakening to privacy for its users’ data.
*This week, EE Times launched a new Special Project package on Artificial Intelligence, with a particular focus on AI fairness. We ask and answer the question: “Will Machines Ever Learn to Be Fair?”
*Later on, we’re joined by Junko Yoshida, EE Times’ chief international correspondent, and EE Times executive editor Dylan McGrath. The two editors moderated panels at VerveCon in sunny Santa Clara. They share their observations from this unusual tech conference, where the main auditorium was filled not with male but with female engineers.
All that to come, but first, here’s Rick Merritt, EE Times Silicon Valley bureau chief. Rick, who attended Google I/O, met up with Dylan afterward. Here are our two editors discussing the highlights of this year’s Google developers' conference.
DYLAN McGRATH: Rick, I understand you attended Google I/O today, and can you give us kind of a high level view of what you saw, what you heard, how it went?
RICK MERRITT: At the highest level, the script that Google CEO Sundar Pichai had about AI everywhere and his high concern for privacy could have been the same script that Mark Zuckerberg used last week at the Facebook developer conference. They're both really concerned about the increasing government scrutiny about how they're selling and sharing and what they're doing with people's personal data, particularly as they're mining more and more, but with AI.
So the focus was, "We're good players here. We're really concerned about your privacy." And there was almost more concern about these social issues than there was about commercial stuff.
Though some news did get announced.
DYLAN McGRATH: And what was some of that news?
RICK MERRITT: Google came out with some mid-market Pixel smartphones and a new home display to enhance their smart home story. But really, the underlying thread to all of it was that they're doing more and more AI everywhere, and they're trying to do more of it on device, so that you don't have to go to the Cloud. And that supports their privacy story.
I think the interesting thing there, though, is you can see where they're wanting to get some commercial advantages, so Google did demo some work on voice interfaces that are getting better and better on the smartphone. So they showed somebody being able to walk through multiple applications and do a mixture of commands and dictation. And their assistant understood what they were doing in the canned demos, and did a pretty good job of it. So their comment was, later this year, the software's going to roll out for their Pixel phones, and the voice interface will be faster than using a touch screen display. That's a significant advantage for them.
DYLAN McGRATH: Absolutely. Well, I guess the similarities between that and the Facebook event really show where Silicon Valley's head is these days with regard to these privacy issues.
You wonder if it's just lip service and if they're serious about this, or if they're just trying to stay out of trouble. Off the government's radar.
RICK MERRITT: Yeah. There's definitely a lot of lip service there and some big fines coming up, but we'll see.
DYLAN McGRATH: Okay. All right, Rick. Well, thank you very much.
RICK MERRITT: Good talking to you, Dylan.
DYLAN McGRATH: Good talking with you.
DAVID FINCH: And now, Junko explains why EE Times has gone after the loaded issue of “fairness” in AI. This ambitious Special Project, shepherded by EE Times senior contributing editor Ann Thryft, digs deep into the challenges of AI. Predictably, Ann found out that most marketers of AI systems-- or so-called “AI solutions”-- prefer not to talk about fairness. However, AI researchers were much more candid.
JUNKO YOSHIDA: This year, Rick Merritt, EE Times Silicon Valley Bureau Chief, put together a stellar Special Project, "It's Still Early Days for AI." Rick covered a wide range of AI issues, from deep learning models and neural networks to AI chips for both learning and inference.
We follow AI aggressively because we know big changes in AI will impact next generation computer architecture. Our job at EE Times is to be there when the impact hits so we can explain what happened and what comes next.
This week, we launched a new Special Project on AI, shepherded by EE Times Senior Contributing Editor Ann Thryft. This time around, our focus is on the fairness of AI.
Unlike a topic like AI performance-- which is typically explained and measured in teraflops, gigahertz and watts-- we decided to pursue the more elusive concept of fairness. Why, you may ask? Because in the pursuit of automation, humans are beginning to cede to machines whole realms of decision making for tasks like hiring, credit scoring, customer service and even driving. When we take people out of the loop, we assure ourselves machines are more efficient. They do the job faster, cheaper and more accurately, without the blunder, bias and fatigue that afflict mere mortals. Or do they?
In recent years, AI researchers have realized AI is not as accurate as it's cracked up to be. As Ann Thryft, born skeptical, points out, much of the fairness of an AI's decision making depends on the accuracy and completeness of the data sets used to train its algorithms. This is an elegant way to say garbage in, garbage out. A machine's decision also depends on the accuracy of the algorithm itself and how it understands success. An optimization strategy used by a training algorithm can sometimes actually amplify bias.
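The "garbage in, garbage out" dynamic Junko describes can be sketched in a few lines of Python. The hiring records, groups and the toy "training" step below are fabricated purely for illustration:

```python
# Minimal illustration of "garbage in, garbage out": a naive model fit to
# historical decisions simply reproduces whatever bias those decisions
# contained. All data here is made up for illustration.

# Historical hiring records: (years_experience, neighborhood, hired).
# Suppose past human reviewers unfairly favored neighborhood "A".
history = [
    (5, "A", True), (2, "A", True), (1, "A", True),
    (5, "B", False), (6, "B", False), (2, "B", True),
]

def train_rate_by_group(records):
    """'Training' here is just memorizing per-group success rates --
    enough to show that the learned statistic mirrors the biased labels."""
    rates = {}
    for _, group, label in records:
        n, k = rates.get(group, (0, 0))
        rates[group] = (n + 1, k + (1 if label else 0))
    return {g: k / n for g, (n, k) in rates.items()}

print(train_rate_by_group(history))
# The model "learns" that group A candidates succeed far more often --
# not because they do, but because the training labels said so.
```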
The black box nature of such algorithms is also worrisome, making it almost impossible for humans to explain how machines reach certain conclusions.
The engineering community has made remarkable progress with AI. We all applaud it. But we pause the clapping when we hear the AI developers say, "AI works most of the time." Most of the time isn't good enough. People will expect the machines to decide soundly, safely, accurately and fairly. People hold other people accountable. They expect no less from AI. In engineering, we often say, "security by design." We recognize it's high time for an engineering community committed to AI development to start thinking about fairness by design.
Our AI Fairness Special Project includes real world cases of bad AI behavior, coupled with discussion of an emerging framework and potential standards that define AI fairness. We ask, and answer, whether AI fairness can be regulated. We also explore tools-- although not many exist yet-- under development that are designed to de-bias or audit algorithms and data sets.
Author Ann Thryft poses five big questions about fairness of AI to researchers from MIT, Stanford and IBM. The bottom line is simple: We can't just assume AI will be any fairer and more accurate than its human parents. We need designers of AI software and hardware to start thinking about fairness before they embark on their next AI project. You can't add fairness to your system as an afterthought.
This is Junko Yoshida, EE Times.
DAVID FINCH: Sally Ward-Foxton, EE Times European Correspondent, who also contributed to our Special Project on AI fairness, explains now, more specifically, how financial institutions are increasingly using AI-- particularly in machine learning-- to make decisions on credit scores, credit risks and lending, and where bias creeps into the process. Here's Sally with more.
SALLY WARD-FOXTON: Financial institutions have embraced AI and machine learning technology to determine consumers’ credit scores and decide on their loan applications, because the technology can consider large amounts of data and make quick and accurate decisions.
The trouble is, even though there are no people involved in making the decision, studies have shown that these systems can still exhibit unintentional bias against minority groups. This is despite the law in the US that makes this type of discrimination illegal.
When we’re talking about consumer credit scores and loan decisions, there's obviously a lot at stake. When you’re deciding who gets a loan, you might be deciding whether that person can own their home, whether that person goes to college, or whether that person can cover their medical expenses. If the decision-making process is biased against any group of people, there are big implications for society as a whole.
So how can banks check for bias in their models, and how do they fix it? These systems consider thousands of variables from each applicant and use techniques like neural networks to model complicated interactions between the variables. As these techniques develop and evolve, the complexity of these models will only increase. In other words, it can be pretty difficult to tell where that bias is coming from, and it’s only going to get harder.
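One common first check for group-level bias in lending decisions is the "four-fifths rule" disparate-impact heuristic used in US fair-lending and employment analysis. The sketch below is a hypothetical example of that check; the decision lists and the 0.8 threshold are illustrative, not any bank's actual data or policy:

```python
# Hypothetical sketch: auditing a credit model's decisions for group-level
# bias using the "four-fifths rule" disparate-impact heuristic.

def approval_rate(decisions):
    """Fraction of applicants approved (decisions are True/False)."""
    return sum(decisions) / len(decisions)

def disparate_impact_ratio(protected, reference):
    """Ratio of the protected group's approval rate to the reference
    group's. Values below ~0.8 are conventionally flagged for review."""
    return approval_rate(protected) / approval_rate(reference)

# Illustrative model outputs: True = loan approved.
group_a = [True, True, True, False, True, True, False, True]    # reference
group_b = [True, False, False, True, False, False, True, False]  # protected

ratio = disparate_impact_ratio(group_b, group_a)
print(f"Disparate impact ratio: {ratio:.2f}")
print("Flag for review" if ratio < 0.8 else "Within heuristic threshold")
```

A ratio well below 0.8 doesn't prove the model is unfair on its own, but it tells an auditor where to start digging.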
I spoke to AI model fairness expert Jay Budzik. His company, ZestFinance, uses mathematical game theory to analyze banks’ models. They can determine which variables are driving the bias, and then they can tune the model to make a better trade off between accuracy and fairness.
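The game-theoretic idea Sally refers to is typically the Shapley value, which attributes a model's output to each input variable by averaging that variable's marginal contribution over all subsets of the other features. The toy credit-score model below is a hypothetical sketch of the concept, not ZestFinance's actual method:

```python
# Illustrative sketch: exact Shapley values for a tiny model, showing how
# game theory can reveal which variables drive a score. Model, weights and
# applicant data are all made up for illustration.
from itertools import combinations
from math import factorial

def shapley_values(model, features, baseline):
    """Exact Shapley values for a small feature set.

    model(x): scoring function taking a dict of feature -> value.
    features: dict of the applicant's actual feature values.
    baseline: dict of "missing feature" stand-in values (e.g. averages).
    """
    names = list(features)
    n = len(names)
    phi = {}
    for f in names:
        others = [g for g in names if g != f]
        total = 0.0
        for k in range(len(others) + 1):
            for subset in combinations(others, k):
                # Shapley weight for a coalition of size k
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                with_f = {g: features[g] if (g in subset or g == f)
                          else baseline[g] for g in names}
                without_f = {g: features[g] if g in subset
                             else baseline[g] for g in names}
                total += weight * (model(with_f) - model(without_f))
        phi[f] = total
    return phi

# Toy linear credit-score model (weights invented for illustration).
def score(x):
    return 0.5 * x["income"] + 0.3 * x["history"] - 0.2 * x["debt"]

applicant = {"income": 80, "history": 10, "debt": 30}
average = {"income": 60, "history": 5, "debt": 20}

print(shapley_values(score, applicant, average))
```

For a linear model each Shapley value reduces to weight times the deviation from the baseline; real tools use sampling or model-specific shortcuts, since exact enumeration is exponential in the number of features.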
So there are ways of making AI fairer. The real question is, will banks choose to use them? These models are highly optimized for accuracy, to make the banks the most profit, which implies that changing the model in any way might mean they don’t make as much money. So fairness, despite the law, may well be a difficult sell.
This is Sally Ward-Foxton reporting from London for EETimes.
DAVID FINCH: At VerveCon, a conference devoted to women in tech, Junko and Dylan worked together as moderators of a keynote session and a roundtable. Their panelists were, in every sense of the word, "the best and the brightest" in the tech industry today, including distinguished engineers and engineering directors from companies such as Google, Microsoft, Oracle, Linkedin and Intuit.
Here are Junko and Dylan after the conference.
JUNKO YOSHIDA: This is a rare occasion: I happen to be in Silicon Valley, and I'm with Dylan McGrath, Executive Editor of EE Times. And we just came back from a conference called VerveCon, which is a women-in-tech conference. This is the second year of that conference, and we had about 800 people, I think.
DYLAN McGRATH: Give or take.
JUNKO YOSHIDA: Yeah. What was your impression, considering that the 800 people who were there-- probably 99% of those people who came-- were all women?
DYLAN McGRATH: Well, it was quite a difference from most of the conferences, most of the engineering conferences I attend. In fact, I had a conversation with someone in line: basically there were only a handful of men there, and she made the point that this is how she typically feels when she goes to a conference. And it really was quite the reverse. I've never been to a conference like that. So it was eye-opening.
JUNKO YOSHIDA: Yeah. This is a conference, actually, I think it serves two purposes. One is more of a career development conference, but also the founder-- Sudha is the founder of this conference-- she believes in continuous education. That means that there are a lot of technical sessions to develop your career, so everything from AI to blockchain to natural language processing. There are a lot of tech seminars, too.
DYLAN McGRATH: Absolutely. That was one of the things that surprised me the most, was that the content at the conference was not just for women.
JUNKO YOSHIDA: Exactly.
DYLAN McGRATH: It wasn't all built around being a woman in technology. A lot of it was just straight technology. And in that sense, other than the demographic of the attendees, which was mostly women, it wasn't that much different from those other conferences.
JUNKO YOSHIDA: Exactly. But another thing that was interesting to me was, Dylan and I moderated two panels, and both panels were excellent, but I felt like we were able to see a little bit behind the curtain of top technology companies in Silicon Valley, what's going on in the work environment, how they actually grow people within the company. You know, for example, like Google, right?
DYLAN McGRATH: Yeah. Yeah. Fascinating to see. I mean, they obviously have their own distinct culture. And I also found, when we talked about this afterwards, one of the most interesting things that was discussed was how being at a company like that, you're just surrounded by the best and the brightest of people who have excelled throughout their college and early career and have always been the smartest person in the room. And now this is the first time that they're not.
JUNKO YOSHIDA: This is the first time!
DYLAN McGRATH: And that's something I never really thought about before.
JUNKO YOSHIDA: Yeah, they're all kind of homogeneous, right? They went to the top schools. They probably haven't had any experience of big failures or anything. So they come to Google, and this is the first time they realize they're just in the middle of the pack.
DYLAN McGRATH: Right. Yeah. So that was quite eye-opening.
JUNKO YOSHIDA: So how you differentiate yourself was one of the big conversations.
DYLAN McGRATH: And the answer given was: it's resiliency that makes a leader. Someone that is... not that they're afraid to fail, but someone who WILL fail and will continually get up and try again.
JUNKO YOSHIDA: Right. And also, I think the emphasis was that, in order to be a leader, you really need to find your allies, right? Whether or not you are career hopping within the company, you always need to find your allies, your mentors, and then if you're lucky, you get what they call "sponsors," meaning somebody who can vouch for you, who can talk you up. And it's not a formal relationship, but it seems like the culture is there to help each other.
DYLAN McGRATH: Yeah. And I think that was another one of the really eye-opening parts about the conference. Again, it did focus a lot on career development and the importance of having a good mentor, a good sponsor, someone to serve as a sounding board and help you guide your own career. That's a very interesting concept.
JUNKO YOSHIDA: All right. Well, thank you very much. It was good to see you.
DYLAN McGRATH: Good to see you, too, Junko.
JUNKO YOSHIDA: All right. Thanks.
DYLAN McGRATH: Talk to you soon.
DAVID FINCH: And finally, it's Mother's Day weekend here in the US. And we conclude with a story about saving the lives of mothers and children in underprivileged areas in this piece sponsored by Arrow Electronics.
Recently, I was joined by Victor Gao, Chief Marketing Officer at Arrow, to talk about a project called The Solar Suitcase, which was a joint development project between Arrow Electronics and We Care Solar.
Victor, welcome. And please tell me a little bit about this project. Why The Solar Suitcase?
VICTOR GAO: So every year, more than 300,000 women die in childbirth because they happen to go into labor at night when it's dark, and there's no electricity. So they could bleed out, they could be infected, what have you. And the solution for that is really just lighting. And as we looked at what we do really well at Arrow, we worked with We Care Solar to design a suitcase that is essentially a solar-powered battery.
DAVID FINCH: I see. And what were some of the challenges that you were solving for together?
VICTOR GAO: As we looked at the design challenge for The Solar Suitcase, it's actually threefold: It has to be lightweight; it has to be cheap; and it has to be durable, considering the field environment into which The Solar Suitcase would be deployed.
So what we did was, we had an engineer-- her name is Cheri Sanchez, and she's worked extensively in this area-- and we created The Solar Suitcase. And we've now deployed it in hundreds of villages. Currently mostly in Africa, but we're also looking at South Asia, South America, even some of the developed nations; there are certainly rural areas that could use something other than grid electricity for lighting.
So as a result of deploying these suitcases, we had initially expected the mortality rate for both the mother and the child to go down substantially. It went down to zero as soon as we started deploying these solar suitcases. Because what they do is, they provide medical lighting and power for small medical devices for up to eight hours, just from a day's charging in the sun.
DAVID FINCH: And are these easy to use? Or do they require like a technician to make them operable?
VICTOR GAO: The solar suitcases are very easy to use. They're very easy to install. In fact, most of the time we find that the clinical staff can do it themselves-- if you know how to use a screwdriver and some bolts, that's all it takes to get the suitcase mounted on the wall. And the module inside the suitcase is detachable, so you can take it outside and charge it during the day.
DAVID FINCH: Arrow, of course, has a long legacy in volume component distribution and supply chain leadership in the electronics industry. Does the solar suitcase signal a fundamental shift toward solution development for the company?
VICTOR GAO: Arrow is a company-- we are approaching our 85th year since our founding in 1935-- and I think the folks who have worked at Arrow throughout the decades have always had this idea of using technology for good. I mean, when our founders-- the three founders who started Arrow on Radio Row in Manhattan-- initially they just repaired radios, and they soon realized that they could help more people if they went into component distribution. So what I would say is, rather than a shift in how we think about what we do, we see it as ever more important to be explicit about what it is we're trying to do. Which, at the end of the day, has a lot to do with moving components-- physical objects-- around the world, in the supply chain, which is extremely difficult to do on time, safely, in compliance with environmental standards, import/export controls, what have you.
Let's not forget the other half of our business, which is on the enterprise computing side. Think of the entire IT stack serving companies anywhere from a startup trying to build artificial intelligence and machine learning platforms all the way to large enterprise companies, industrial firms, and also government-- NASA, the European Space Agency, etc. As we think about delivering these technologies and solutions to our customers, one question has always been at the center of what we do: how is this going to improve the quality of life for as many people as possible?
And what we found-- especially in recent years-- is that kind of call to action that we've always had internally is resonating significantly with our broader community. And my sense is, everybody always kind of knew that, always were doing that on their own, on our own, but now, because of the rapid proliferation of communication and ability to co-work, the ability to work across the globe, you can actually make a significant impact faster.
DAVID FINCH: Well congratulations are in order, certainly on winning the 2019 Edison Awards. Tell me a little bit about the Awards, the ceremony and what that's all about.
VICTOR GAO: It was a huge privilege, and obviously a lot of fun, to attend the 2019 Thomas Edison Awards gala. Ginni Rometty-- a technology leader I have tremendous respect for, as CEO of IBM-- was the guest of honor that night. And it's funny, because right next to the Arrow table sat probably five to seven scientists who developed not only the world's fastest but also the world's second-fastest supercomputers, from IBM. So you just think about the amount of brain power sitting right there. And certainly we think of IBM as a very good partner of ours-- who, by the way, has the world's first commercially available quantum computer, made available through the cloud. So just to be able to sit there with our partners at We Care Solar, who really care about the lives of people less advantaged than we are in the developed countries, and at the same time to sit in the same room with these scientists-- it was quite an experience and a huge privilege to share the stage with some of the brightest minds in the technology sector.
DAVID FINCH: Victor Gao, thank you for joining us.
This has been your weekly briefing from EETimes. You can read all these stories and more at EETimes.com. Thanks for listening.