A governance system can be understood as the full set of institutional arrangements, including the rules and the agents who create them, that regulate transactions within and across the boundaries of economic systems (Hollingsworth, Schmitter & Streeck, 1994). These arrangements encompass both state and non-state organizations and operate through formal and informal rules, norms, and beliefs. In the context of artificial intelligence (AI), governance systems determine how data is collected, shared, and safeguarded, how algorithms are trained and deployed, and how accountability is ensured (Shin & Ahmad, 2025). Effective AI governance is therefore critical to balancing innovation with ethical, legal, and social considerations.
New AI-related regulative institutions are rapidly expanding to address these concerns. Some focus narrowly on specific activities, such as employment, while others provide comprehensive frameworks covering the full spectrum of AI use. For instance, in April 2023, New York City adopted final rules governing the use of automated employment decision tools in hiring and promotion (Paretti, Ray, Freedberg & McPike, 2023). At the same time, broader regulatory initiatives are underway in major jurisdictions including China, the EU, Japan, the U.K., and the U.S., each seeking to establish rules that ensure AI systems are safe, transparent, and accountable.
In many cases, normative frameworks and rules have emerged to fill regulatory gaps, especially where formal agencies remain underdeveloped or absent (Kshetri, 2024). These mechanisms are typically prescriptive rather than coercive, guiding behavior without the force of law. Such institutions include voluntary guidelines and codes of conduct, technical standards, and certification programs, all of which provide structure and accountability in the absence of comprehensive regulation.
As AI technologies rapidly expand across sectors such as healthcare, finance, education, defense, and government, the need to safeguard responsible use, transparency, and accountability has become more pressing than ever. Yet, despite growing recognition of these challenges, governance mechanisms remain at an early stage of development. Regulatory and oversight frameworks often lag behind technological advances, leaving ethical, legal, and operational blind spots that can undermine trust and exacerbate risks.
This special issue of IEEE Computer will contribute to the global conversation on AI governance and compliance. It aims to bring together interdisciplinary voices from policy, academia, industry, and civil society to explore strategies for regulating, auditing, and governing AI systems so that they align with human values, social norms, and legal expectations.
AI governance is not only a matter of technical risk management but also of societal trust and democratic accountability (Floridi et al., 2018; Mittelstadt, 2019; Shin et al., 2024). This issue will spotlight global efforts to develop comprehensive regulatory frameworks, such as the EU AI Act (European Commission, 2021) and the NIST AI Risk Management Framework (NIST, 2023), as well as efforts by organizations such as the OECD, ISO/IEC, and the IEEE Standards Association.
We invite high-quality, original contributions that explore topics including, but not limited to, the following:
For author information and guidelines on submission criteria, visit the Author’s Information Page. Please submit papers through the IEEE Author Portal and be sure to select the special issue or special section name. Manuscripts must not have been previously published and must not be under review for publication elsewhere. Please submit only full papers intended for review, not abstracts.
In addition to submitting your paper to Computer, you are encouraged to upload the data related to your paper to IEEE DataPort. IEEE DataPort is IEEE's data platform, which supports the storage and publishing of datasets while also providing access to thousands of research datasets. Uploading your dataset to IEEE DataPort will strengthen your paper and support research reproducibility. Your paper and dataset can be linked, providing a good opportunity to increase the number of citations you receive. Data can be uploaded to IEEE DataPort prior to submitting your paper or concurrently with the paper submission. Thank you!
Contact the guest editors at: