AI chatbots could 'easily be programmed' to groom young men into launching terror attacks, warns top lawyer

Featured in The Daily Mail



Raffaello Pantucci, a counter-terrorism expert at the Royal United Services Institute (RUSI) think tank, said: 'The danger with AI like ChatGPT is that it could enhance a "lone actor terrorist", as it would provide a perfect foil for someone seeking understanding by themselves but worried about talking to others.'

On the question of whether an AI company could be held responsible if a terrorist launched an attack after being groomed by a bot, Mr Pantucci explained: 'My view is that it is a bit difficult to blame the company, as I am not entirely sure they are able to control the machine themselves.'