An Analysis of Scientific Requirements on Artificial Intelligence Governance
Abstract
The emergence of AI technology has prompted the need for standardization and governance due to the potential societal risks associated with its use. However, there is currently no common concept for AI standardization that considers a broad range of social and ethical subject areas. International cooperation is necessary to address the possible threats, and various nations and organizations have already made initial efforts in this direction. Our overall research question investigates to what extent requirements based on scientific insights have been addressed in international standards and what new insights standardization efforts can offer to science. In this paper, we report the findings of an extensive systematic literature review of 482 scientific articles, conducted using a hybrid analysis process that combines manual coding with generative-AI-supported triangulation steps. The resulting 17 requirements will serve as the basis for a thematic analysis of the most relevant AI standards currently being developed and deployed globally.