7.26 第二篇 人脸识别技术也是一种威胁

  1. algorithm [ˈælɡəˌrɪðəm]  n.演算法;运算法则;计算程序
  2. sexuality [ˌsɛkʃuˈælɪti] n.性;性取向;性征
  3. homosexuality [ˌhɒməˌsekʃʊ'ælətɪ] n.同性恋
  4. prejudice [ˈprɛdʒədɪs] n.侵害;偏见,歧视;伤害
  5. bias[ˈbaɪəs] n.偏见;倾向;偏爱
  6. filter [ˈfɪltɚ] n.滤波器;滤光器;滤色镜;vi.过滤;透过;渗透
  7. ethnicity [ɛθˈnɪsɪti] n.种族地位,种族特点,种族渊源
  8. inevitably [ɪnˈevɪtəbli] adv.难免;不可避免地,自然而然地;
  9. automated [ˈɔːtəmeɪtɪd] adj.自动化的 v.使自动化(automate的过去式和过去分词)
  10. sentence [ˈsɛntəns] n.句子;判决 vt.宣判,判决
  11. dissemble [dɪˈsɛmbəl] vt.假装,掩饰
  12. machine-learning机器学习
  13. facial-recognition 人脸识别
  14. irritation [ˌɪrɪˈteʃən] n.刺激;激怒,恼怒,生气;
  15. calculation [ˌkælkjəˈleʃən]  n.计算,盘算;估计;计算的结果;
  16. rational [ˈræʃənəl] adj.理性的;合理的;理智的;
  17. transactional[træn'zækʃənəl] adj.交易的,业务的;
  18. embedded [ɪm'bedɪd]  adj.植入的,深入的,内含的v.把…嵌入,埋入
  19. decree [dɪˈkri]  n.法令,命令;裁定;vt.命令;颁布…为法令;
  20. biometric [ˌbaɪoʊˈmetrɪk] adj.生物特征识别的;生物计量的
  21. propagate [ˈprɑ:pəgeɪt]  vt.繁衍,增殖;使遗传;扩散;
  22. unintentionally [ˌʌnɪn'tenʃənəlɪ] adv.无意之中;非故意地,非存心地
  23. bamboozle [bæmˈbuzəl] vt.欺骗,使迷惑
  24. undemocratic [ˌʌndeməˈkrætɪk] adj.不民主的;非民主的

词组

1.Owing to 由于

2.daily life 日常生活

3.cropped up 突然出现

4.facial recognition 面部识别

5.at least 至少

6.for fear of 害怕...

7.applied to 适用于;应用于

8.belongs to 属于

But the technology also threatens.

Researchers at Stanford University have demonstrated that, when shown pictures of one gay man, and one straight man, the algorithm could attribute their sexuality correctly 81% of the time.

Humans managed only 61%.

In countries where homosexuality is a crime, software which promises to infer sexuality from a face is an alarming prospect.

Less violent forms of discrimination could also become common.

Employers can already act on their prejudices to deny people a job.

But facial recognition could make such bias routine, enabling firms to filter all job applications for ethnicity and signs of intelligence and sexuality.

Nightclubs and sports grounds may face pressure to protect people by scanning entrants' faces for the threat of violence—even though, owing to the nature of machine-learning, all facial-recognition systems inevitably deal in probabilities.

Moreover, such systems may be biased against those who do not have white skin, since algorithms trained on data sets of mostly white faces do not work well with different ethnicities.

Such biases have cropped up in automated assessments used to inform courts' decisions about bail and sentencing.

Eventually, continuous facial recording and gadgets that paint computerised data onto the real world might change the texture of social interactions.

Dissembling helps grease the wheels of daily life.

If your partner can spot every suppressed yawn, and your boss every grimace of irritation, marriages and working relationships will be more truthful, but less harmonious.

The basis of social interactions might change, too, from a set of commitments founded on trust to calculations of risk and reward derived from the information a computer attaches to someone's face.

Relationships might become more rational, but also more transactional.

In democracies, at least, legislation can help alter the balance of good and bad outcomes.

European regulators have embedded a set of principles in forthcoming data-protection regulation, decreeing that biometric information, which would include “faceprints”, belongs to its owner and that its use requires consent—so that, in Europe, unlike America, Facebook could not just sell ads to those car-showroom visitors.

Laws against discrimination can be applied to an employer screening candidates' images.

Suppliers of commercial face-recognition systems might submit to audits, to demonstrate that their systems are not propagating bias unintentionally.

Firms that use such technologies should be held accountable.

Such rules cannot alter the direction of travel, however.

Cameras will only become more common with the spread of wearable devices.

Efforts to bamboozle facial-recognition systems, from sunglasses to make-up, are already being overtaken; research from the University of Cambridge shows that artificial intelligence can reconstruct the facial structures of people in disguise.

Google has explicitly turned its back on matching faces to identities, for fear of its misuse by undemocratic regimes.

Other tech firms seem less picky.

Amazon and Microsoft are both using their cloud services to offer face recognition; it is central to Facebook's plans.

Governments will not want to forgo its benefits.

Change is coming.

Face up to it.

=================

But the technology also threatens.

但是,技术也是威胁。

Researchers at Stanford University have demonstrated that, when shown pictures of one gay man, and one straight man, the algorithm could attribute their sexuality correctly 81% of the time.

斯坦福大学的研究表明,在被出示给一张同性恋之人的照片和一张异性恋之人的照片时,算法在81%的时间中可以正确地确定他们的性取向。

Humans managed only 61%.

人类经过努力才能达到61%。

In countries where homosexuality is a crime, software which promises to infer sexuality from a face is an alarming prospect.

在同性恋是一种犯罪的国家中,能从脸部断定性取向的软件是一种令人担忧的前景。

Less violent forms of discrimination could also become common.

不那么暴力的歧视形式也可能变得很常见。

Employers can already act on their prejudices to deny people a job.

雇主早就能够根据自己的偏见拒绝给人们一份工作。

But facial recognition could make such bias routine, enabling firms to filter all job applications for ethnicity and signs of intelligence and sexuality.

但是,脸部识别可能让这类偏见成为常态,使公司能够按种族以及智力和性取向的特征去筛选所有的求职申请。

Nightclubs and sports grounds may face pressure to protect people by scanning entrants' faces for the threat of violence—even though, owing to the nature of machine-learning, all facial-recognition systems inevitably deal in probabilities.

夜总会和运动场可能面临压力,要求它们通过扫描入场者的面部来排查暴力威胁、保护民众——尽管由于机器学习的性质,所有的脸部识别系统都不可避免地只是在处理概率。

Moreover, such systems may be biased against those who do not have white skin, since algorithms trained on data sets of mostly white faces do not work well with different ethnicities.

再者,由于用以白人面孔为主的数据集训练出来的算法难以很好地识别其他族裔,这类系统可能对非白人存有偏见。

Such biases have cropped up in automated assessments used to inform courts' decisions about bail and sentencing.

这类偏见已经在为法庭保释和量刑决定提供参考的自动评估中露出了苗头。

Eventually, continuous facial recording and gadgets that paint computerised data onto the real world might change the texture of social interactions.

最后,持续不断的脸部记录行为以及能把经过计算机处理的数据投射到现实世界的各种小工具可能改变社会交往的结构。

Dissembling helps grease the wheels of daily life.

掩饰有助于日常生活的润滑。

If your partner can spot every suppressed yawn, and your boss every grimace of irritation, marriages and working relationships will be more truthful, but less harmonious.

如果伴侣能够发现每一个被强压下去的哈欠,如果老板能够发现每一个带着不满的苦相,婚姻和工作关系将会更加诚实,但是,却少了一些和谐。

The basis of social interactions might change, too, from a set of commitments founded on trust to calculations of risk and reward derived from the information a computer attaches to someone's face.

社会交往的基础或许也将改变,从一套建立在信任之上的承诺,变成依据计算机附加在某人脸上的信息而做出的风险与回报的算计。

Relationships might become more rational, but also more transactional.

关系可能变得更加理性,但也会变得更具交易性。

In democracies, at least, legislation can help alter the balance of good and bad outcomes.

至少在民主国家,立法能够帮助改变好坏结果之间的平衡。

European regulators have embedded a set of principles in forthcoming data-protection regulation, decreeing that biometric information, which would include “faceprints”, belongs to its owner and that its use requires consent—so that, in Europe, unlike America, Facebook could not just sell ads to those car-showroom visitors.

欧洲监管者已经在即将出台的数据保护法规中嵌入了一套原则,规定包括“脸纹”在内的生物特征信息属于其所有者,且使用这些信息需要征得同意——因此,与美国不同,在欧洲,Facebook不能随意向那些汽车展厅的访客推送广告。

Laws against discrimination can be applied to an employer screening candidates' images.

反歧视法律可以适用于借助影像筛查求职者的雇主。

Suppliers of commercial face-recognition systems might submit to audits, to demonstrate that their systems are not propagating bias unintentionally.

商用脸部识别系统的供应商可能需要接受审计,以证明其系统没有在无意中扩散偏见。

Firms that use such technologies should be held accountable.

使用这类技术的公司应当被追责。

Such rules cannot alter the direction of travel, however.

然而,这类规则不可能改变行进的方向。

Cameras will only become more common with the spread of wearable devices.

随着可穿戴设备的普及,影像设备只会变得更加常见。

Efforts to bamboozle facial-recognition systems, from sunglasses to make-up, are already being overtaken; research from the University of Cambridge shows that artificial intelligence can reconstruct the facial structures of people in disguise.

从墨镜到易容等各种欺骗脸部识别系统的尝试早已被赶超;剑桥大学的研究显示,人工智能能够重建伪装的人脸结构。

Google has explicitly turned its back on matching faces to identities, for fear of its misuse by undemocratic regimes.

由于担心被非民主政权滥用,Google已经明确地反对让人脸与身份相匹配。

Other tech firms seem less picky.

其他技术公司似乎没有这么挑剔。

Amazon and Microsoft are both using their cloud services to offer face recognition; it is central to Facebook's plans.

亚马逊和微软都在使用它们的云服务来提供脸部识别;这也是脸书各项计划的关键。

Governments will not want to forgo its benefits.

各国政府不会放弃脸部识别的各种好处。

Change is coming.

改变正在到来。

Face up to it.

面对它吧。

================

原文:But the technology also threatens. Researchers at Stanford University have demonstrated that, when shown pictures of one gay man, and one straight man, the algorithm could attribute their sexuality correctly 81% of the time. Humans managed only 61%. In countries where homosexuality is a crime, software which promises to infer sexuality from a face is an alarming prospect.

官方译文:但是,技术也是威胁。斯坦福大学的研究表明,在被出示给一张同性恋之人的照片和一张异性恋之人的照片时,算法在81%的时间中可以正确地确定他们的性取向。人类经过努力才能达到61%。在同性恋是一种犯罪的国家中,能从脸部断定性取向的软件是一种令人担忧的前景。

高斋翻译修订后的译文:但技术也是种威胁。斯坦福大学的研究表明,出示一张同性恋者的照片和一张异性恋者的照片,机器算法能准确判断他们的性取向,准确率达81%,而人类勉强能达到61%。在同性恋是一种犯罪的国家,软件能从脸部识别性取向的前景令人担忧。

1. 背景知识:人工智能“gay达”可凭一张照片判断性取向,准确率达81%。人类在这方面的判断表现逊于机器算法,其判断男性性向的准确率仅为61%,女性的为54%。

2. Manage:VERB 勉强做出(微笑、寒暄等); 强作 
If you say that someone managed a particular response, such as a laugh or a greeting, you mean that it was difficult for them to do it because they were feeling sad or upset.

He looked dazed as he spoke to reporters, managing only a weak smile...

跟记者谈话的时候,他只是勉强挤出一丝淡淡的微笑,显得非常恍惚。

转载自www.cnblogs.com/wanghui626/p/9370463.html