{"id":3252,"date":"2021-04-16T02:35:37","date_gmt":"2021-04-16T02:35:37","guid":{"rendered":"https:\/\/www.tech-battery.com\/batteriesblog\/?p=3252"},"modified":"2021-04-16T02:35:37","modified_gmt":"2021-04-16T02:35:37","slug":"europe-seeks-to-limit-use-of-ai-in-society","status":"publish","type":"post","link":"https:\/\/www.tech-battery.com\/batteriesblog\/europe-seeks-to-limit-use-of-ai-in-society\/","title":{"rendered":"Europe seeks to limit use of AI in society"},"content":{"rendered":"\n<p>The use of facial recognition for surveillance, or algorithms that manipulate human behaviour, will be banned under proposed EU regulations on artificial intelligence.<\/p>\n\n\n\n<p>The wide-ranging proposals, which were leaked ahead of their official publication, also promised tough new rules for what they deem high-risk AI.<\/p>\n\n\n\n<p>That includes algorithms used by the police and in recruitment.<\/p>\n\n\n\n<p>Experts said the rules were vague and contained loopholes.<\/p>\n\n\n\n<p>The use of AI in the military is exempt, as are systems used by authorities in order to safeguard public security.<\/p>\n\n\n\n<p>The suggested list of banned AI systems includes:<\/p>\n\n\n\n<p>those designed or used in a manner that manipulates human behaviour, opinions or decisions \u2026causing a person to behave, form an opinion or take a decision to their detriment<br>\nAI systems used for indiscriminate surveillance applied in a generalised manner<br>\nAI systems used for social scoring<br>\nthose that exploit information or predictions about a person or group of persons in order to target their vulnerabilities<\/p>\n\n\n\n<p>European policy analyst Daniel Leufer tweeted that the definitions were very open to interpretation.<\/p>\n\n\n\n<p>&#8220;How do we determine what is to somebody&#8217;s detriment? 
And who assesses this?&#8221; he wrote.<\/p>\n\n\n\n<p>For AI deemed to be high risk, member states would have to apply far more oversight, including the need to appoint assessment bodies to test, certify and inspect these systems.<\/p>\n\n\n\n<p>And any companies that develop prohibited services, or fail to supply correct information about them, could face fines of up to 4% of their global revenue, similar to fines for GDPR breaches.<\/p>\n\n\n\n<p>High-risk examples of AI include:<\/p>\n\n\n\n<p>systems which establish priority in the dispatching of emergency services<br>\nsystems determining access to or assigning people to educational institutes<br>\nrecruitment algorithms<br>\nthose that evaluate creditworthiness<br>\nthose for making individual risk assessments<br>\ncrime-predicting algorithms<\/p>\n\n\n\n<p>Mr Leufer added that the proposals should &#8220;be expanded to include all public sector AI systems, regardless of their assigned risk level&#8221;.<\/p>\n\n\n\n<p>&#8220;This is because people typically do not have a choice about whether or not to interact with an AI system in the public sector.&#8221;<\/p>\n\n\n\n<p>As well as requiring that new AI systems have human oversight, the EC is also proposing that high-risk AI systems have a so-called kill switch, which could either be a stop button or some other procedure to instantly turn the system off if needed.<\/p>\n\n\n\n<p>&#8220;AI vendors will be extremely focussed on these proposals, as it will require a fundamental shift in how AI is designed,&#8221; said Herbert Swaniker, a lawyer at Clifford Chance.<\/p>\n\n\n\n<p>Sloppy and dangerous<br>\nMeanwhile, Michael Veale, a lecturer in digital rights and regulation at University College London, highlighted a clause that will force organisations to disclose when they are using deepfakes, a particularly controversial use of AI to create fake humans or to manipulate images and videos of real people.<\/p>\n\n\n\n<p>He also told the BBC that the legislation was 
primarily &#8220;aimed at vendors and consultants selling &#8211; often nonsense &#8211; AI technology to schools, hospitals, police and employers&#8221;.<\/p>\n\n\n\n<p>But he added that tech firms that used AI &#8220;to manipulate users&#8221; may also have to change their practices.<\/p>\n\n\n\n<p>With this legislation, the EC has had to walk a difficult tightrope between ensuring AI is used as what it calls &#8220;a tool\u2026 with the ultimate aim of increasing human wellbeing&#8221;, and ensuring it doesn&#8217;t stop EU countries competing with the US and China over technological innovations.<\/p>\n\n\n\n<p>And it acknowledged that AI already informed many aspects of our lives.<\/p>\n\n\n\n<p>The European Centre for Not-for-Profit Law, which had contributed to the European Commission&#8217;s White Paper on AI, told the BBC that there was &#8220;lots of vagueness and loopholes&#8221; in the proposed legislation.<\/p>\n\n\n\n<p>&#8220;The EU&#8217;s approach to binary-defining high versus low risk is sloppy at best and dangerous at worst, as it lacks context and nuances needed for the complex AI ecosystem already existing today.<\/p>\n\n\n\n<p>&#8220;First, the commission should consider risks of AI systems within a rights-based framework &#8211; as risks they pose to human rights, rule of law and democracy.<\/p>\n\n\n\n<p>&#8220;Second, the commission should reject an oversimplified low-high risk structure and consider a tier-based approach on the levels of AI risk.&#8221;<\/p>\n\n\n\n<p>The details could change again before the rules are officially unveiled next week. And they are unlikely to become law for several more years.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The use of facial recognition for surveillance, or algorithms that manipulate human behaviour, will be banned under proposed EU regulations on artificial intelligence. 
The wide-ranging proposals, which were leaked ahead of their official publication, also promised tough new rules for what they deem high-risk AI. That includes algorithms used by the police and in recruitment. &hellip; <a href=\"https:\/\/www.tech-battery.com\/batteriesblog\/europe-seeks-to-limit-use-of-ai-in-society\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Europe seeks to limit use of AI in society&#8221;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-3252","post","type-post","status-publish","format-standard","hentry","category-uncategorized"],"_links":{"self":[{"href":"https:\/\/www.tech-battery.com\/batteriesblog\/wp-json\/wp\/v2\/posts\/3252","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.tech-battery.com\/batteriesblog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.tech-battery.com\/batteriesblog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.tech-battery.com\/batteriesblog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.tech-battery.com\/batteriesblog\/wp-json\/wp\/v2\/comments?post=3252"}],"version-history":[{"count":1,"href":"https:\/\/www.tech-battery.com\/batteriesblog\/wp-json\/wp\/v2\/posts\/3252\/revisions"}],"predecessor-version":[{"id":3253,"href":"https:\/\/www.tech-battery.com\/batteriesblog\/wp-json\/wp\/v2\/posts\/3252\/revisions\/3253"}],"wp:attachment":[{"href":"https:\/\/www.tech-battery.com\/batteriesblog\/wp-json\/wp\/v2\/media?parent=3252"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.tech-battery.com\/batteriesblog\/wp-json\/wp\/v2\/categories?post=3252"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.tech-battery.com\/batteriesblog\/wp-json\/wp\/v2\/tags?post=325
2"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}