{"id":21943,"date":"2023-09-25T06:09:14","date_gmt":"2023-09-25T06:09:14","guid":{"rendered":"https:\/\/web3unplugged.io\/blog\/?p=21943"},"modified":"2023-09-25T06:09:15","modified_gmt":"2023-09-25T06:09:15","slug":"experts-disagree-over-threat-posed-but-artificial-intelligence-cannot-be-ignored","status":"publish","type":"post","link":"https:\/\/web3unplugged.io\/blog\/experts-disagree-over-threat-posed-but-artificial-intelligence-cannot-be-ignored\/","title":{"rendered":"Experts disagree over threat posed but artificial intelligence cannot be ignored"},"content":{"rendered":"\n<p>For some AI experts, a watershed moment in artificial intelligence development is not far away. And the global AI safety summit, to be held at Bletchley Park in Buckinghamshire in November, therefore cannot come soon enough.<\/p>\n\n\n\n<p>Ian Hogarth, the chair of the UK taskforce charged with scrutinising the safety of cutting-edge AI, raised concerns before he took the job this year about artificial general intelligence, or \u201cGod-like\u201d AI. Definitions of AGI vary but broadly it refers to an AI system that can perform a task at a human, or above human, level \u2013 and could evade our control.<\/p>\n\n\n\n<p>Max Tegmark, the scientist behind a headline-grabbing letter this year calling for a pause in large AI experiments, told the Guardian that tech professionals in California believe AGI is close.<\/p>\n\n\n\n<p>\u201cA lot of people here think that we\u2019re going to get to God-like artificial general intelligence in maybe three years. Some think maybe two years.\u201d<\/p>\n\n\n\n<p>He added: \u201cSome think it\u2019s going to take a longer time and won\u2019t happen until 2030.\u201d Which doesn\u2019t seem very far away either.<\/p>\n\n\n\n<p>There are also respected voices who think the clamour over AGI is being overplayed. 
According to one of those counterarguments, the noise is a cynical ploy to regulate and fence off the market and consolidate the position of big players like ChatGPT developer OpenAI, Google and Microsoft.<\/p>\n\n\n\n<p>The Distributed AI Research Institute has warned that focusing on existential risk ignores immediate impacts from AI systems such as: using artists\u2019 and authors\u2019 work without permission in order to build AI models; and using low-paid workers to carry out some of the model-building tasks. Timnit Gebru, founder and executive director of DAIR, last week praised a US senator for raising concerns over working conditions for data workers rather than focusing on \u201cexistential risk nonsense\u201d.<\/p>\n\n\n\n<p>Another view is that uncontrollable AGI simply won\u2019t happen.<\/p>\n\n\n\n<p>\u201cUncontrollable artificial general intelligence is science fiction and not reality,\u201d said William Dally, the chief scientist at the AI chipmaker Nvidia, at a US senate hearing last week. \u201cHumans will always decide how much decision-making power to cede to AI models.\u201d<\/p>\n\n\n\n<p>However, for those who disagree, the threat posed by AGI cannot be ignored. Fears about such systems include refusing \u2013 and evading \u2013 being switched off, combining with other AIs or being able to improve themselves autonomously. Connor Leahy, the chief executive of the AI safety research company Conjecture, said the problem was simpler than that.<\/p>\n\n\n\n<p>\u201cThe deep issue with AGI is not that it\u2019s evil or has a specifically dangerous aspect that you need to take out. It\u2019s the fact that it is competent. If you cannot control a competent, human-level AI then it is by definition dangerous,\u201d he said.<\/p>\n\n\n\n<p>Other concerns held by UK government officials are that the next iteration of AI models, below the AGI level, could be manipulated by rogue actors to produce serious threats such as bioweapons. 
Open source AI, where the models underpinning the technology are freely available and modifiable, is a related concern.<\/p>\n\n\n\n<p>Civil servants say they are also working on combating nearer-term risks, such as disinformation and copyright infringements. But with international leaders arriving at Bletchley Park in a few weeks\u2019 time, Downing Street wants to focus the world\u2019s attention on something officials believe is not being taken seriously enough in policy circles: the chance that machines could cause serious damage to humanity.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>For some AI experts, a watershed moment in artificial intelligence development is not far away. And the global AI safety summit, to be held at Bletchley Park in Buckinghamshire in November, therefore cannot come soon enough. Ian Hogarth, the chair of the UK taskforce charged with scrutinising the safety of cutting-edge AI, raised concerns before [&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":21945,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_seopress_robots_primary_cat":"none","_seopress_titles_title":"","_seopress_titles_desc":"","_seopress_robots_index":"","footnotes":""},"categories":[2],"tags":[],"class_list":["post-21943","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-news"],"rttpg_featured_image_url":{"full":["https:\/\/web3unplugged.io\/blog\/wp-content\/uploads\/2023\/09\/Untitled-4-5.jpg",1240,744,false],"landscape":["https:\/\/web3unplugged.io\/blog\/wp-content\/uploads\/2023\/09\/Untitled-4-5.jpg",1240,744,false],"portraits":["https:\/\/web3unplugged.io\/blog\/wp-content\/uploads\/2023\/09\/Untitled-4-5.jpg",1240,744,false],"thumbnail":["https:\/\/web3unplugged.io\/blog\/wp-content\/uploads\/2023\/09\/Untitled-4-5-150x150.jpg",150,150,true],"medium":["https:\/\/web3unplugged.io\/blog\/wp-content\/uploads\/2023\/09\/Untitled-4-5-300x18
0.jpg",300,180,true],"large":["https:\/\/web3unplugged.io\/blog\/wp-content\/uploads\/2023\/09\/Untitled-4-5-1024x614.jpg",1024,614,true],"1536x1536":["https:\/\/web3unplugged.io\/blog\/wp-content\/uploads\/2023\/09\/Untitled-4-5.jpg",1240,744,false],"2048x2048":["https:\/\/web3unplugged.io\/blog\/wp-content\/uploads\/2023\/09\/Untitled-4-5.jpg",1240,744,false],"post-thumbnail":["https:\/\/web3unplugged.io\/blog\/wp-content\/uploads\/2023\/09\/Untitled-4-5.jpg",700,420,false],"graptor-sq-xs":["https:\/\/web3unplugged.io\/blog\/wp-content\/uploads\/2023\/09\/Untitled-4-5.jpg",100,60,false]},"rttpg_author":{"display_name":"Admin CG","author_link":"https:\/\/web3unplugged.io\/blog\/author\/admin-cg\/"},"rttpg_comment":0,"rttpg_category":"<a href=\"https:\/\/web3unplugged.io\/blog\/category\/news\/\" rel=\"category tag\">news<\/a>","rttpg_excerpt":"For some AI experts, a watershed moment in artificial intelligence development is not far away. And the global AI safety summit, to be held at Bletchley Park in Buckinghamshire in November, therefore cannot come soon enough. 
Ian Hogarth, the chair of the UK taskforce charged with scrutinising the safety of cutting-edge AI, raised concerns before&hellip;","_links":{"self":[{"href":"https:\/\/web3unplugged.io\/blog\/wp-json\/wp\/v2\/posts\/21943","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/web3unplugged.io\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/web3unplugged.io\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/web3unplugged.io\/blog\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/web3unplugged.io\/blog\/wp-json\/wp\/v2\/comments?post=21943"}],"version-history":[{"count":1,"href":"https:\/\/web3unplugged.io\/blog\/wp-json\/wp\/v2\/posts\/21943\/revisions"}],"predecessor-version":[{"id":21946,"href":"https:\/\/web3unplugged.io\/blog\/wp-json\/wp\/v2\/posts\/21943\/revisions\/21946"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/web3unplugged.io\/blog\/wp-json\/wp\/v2\/media\/21945"}],"wp:attachment":[{"href":"https:\/\/web3unplugged.io\/blog\/wp-json\/wp\/v2\/media?parent=21943"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/web3unplugged.io\/blog\/wp-json\/wp\/v2\/categories?post=21943"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/web3unplugged.io\/blog\/wp-json\/wp\/v2\/tags?post=21943"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}