The citizen literacy project has concluded its survey-analysis phase, and preliminary results are now available.
The results are largely what we expected, but they are also very concerning. In the absence of expert, validated resources and of robust, consistent teaching and assessment, people are turning to the media for answers.
The media is very much on the ‘tech bro train’, which means articles and information tend to be positively skewed towards the hype around AI. What this does not give the reader is any awareness of the risks of AI. With the EU AI Act coming into force and various pieces of regulation and legislation under discussion, it is only a matter of time before people are held accountable. Those held accountable will be the people at the ‘coalface’ who have not had the level of training required to use or develop AI in an ethical and robust way. In other words, the untrained person, unaware of the many risks associated with AI, will end up being held to account for any mistakes.
We raised this as a serious concern at BSI and CEN-CENELEC, the UK and EU standards bodies respectively.
We believe this is unacceptable and that full training should be provided to all citizens. Parents should have the right resources to understand the risks associated with AI so they can protect and educate their children. Professionals should have the right awareness to protect society, and indeed themselves against liability, and to be able to provide robust and transparent solutions.
Overwhelmingly, the respondents to our study requested resources. This is echoed by student feedback from the London Mathematical Society/LSE Research School.
Fratcal AI will now begin working at pace with its members to generate robust resources and training so that we can start to plug the gaps highlighted by this project.
A paper on this work will be published in due course.
To get involved, drop us an email and sign up as a member.
We thank Sprite+ for the funding that enables this vital work.