Call for proofs of concept or demos on ethical aspects of smart learning ecosystems (SLE) (emphasis on algorithmic AI solutions)
For this special edition of the Student Demo Contest, organized by Tallinn University, individual students and teams of students (master's and PhD) are invited to submit proofs of concept or demos on ethical aspects of smart learning ecosystems, with an emphasis on algorithmic (AI) solutions. Authors of all accepted proposals are expected to present their proofs of concept or demos at the SLERD 2023 conference. The jury will select the three finalists during the demo session, and the descriptions of the demos will be disseminated through SLERD and TLU.
ASLERD, as sponsor of the Student Demo Contest, offers the first prize of 500 €, which will be awarded to the winning individual or team (as a refund for the expenses incurred to demonstrate the concept at the SLERD 2023 conference).
DEMO CONTEXT and GOALS
Artificial Intelligence in its varied forms is not just a trendy research topic; it is already incorporated into everyday tools. However, users may not notice or be aware of it, which may have relevant ethical implications. We therefore invite you to share your view on how the ethics of learning in Smart Learning Environments enriched by AI should be taken into consideration. How should we interact with tools that use machine learning, learning analytics, neural networks, and so forth? Should we know, and how should we know, which kinds of ethics are in use, how the ethical principles are applied, and in which phases of the learning process?
The European Commission has provided guidelines for teachers on how to teach about ethics and AI (see: Ethical guidelines on the use of artificial intelligence (AI) and data in teaching and learning for educators, from the EU). The project “AI for learning” investigated the views of educational technology providers, teachers, and students on ethical issues. It found that EdTech companies hoped that all involved parties – themselves, consumers, educational institutions, researchers, funders, and decision-makers – would collaborate to overcome the ethical challenges of AI (Kousa & Niemi, 2022).
Many applications are directed at teachers (Niemi, 2021); we aim to hear and see students’ perspectives on the tools and ethics of smart learning ecosystems. The main ethical principles are (Niemi, 2021; Akgun & Greenhow, 2022):
- Beneficence: promoting well-being, preserving dignity, and sustaining the planet;
- Non-maleficence: privacy, security, and “capability caution”;
- Autonomy: the power to decide (or whether to decide);
- Justice: promoting prosperity and preserving solidarity, ensuring that AI creates shared (or at least shareable) benefits, and preventing the creation of new harms;
- Explicability: enabling the other principles through intelligibility and accountability – “transparency,” “accountability,” “intelligibility,” “understandable and interpretable.”
The jury will select the three best proofs of concept/demos during the demo session; these will be awarded a blockchain-anchored e-certificate. Based on the presentations and prototype demonstrations, the jury will announce the contest winner, who will be awarded the first prize of 500 €.
ASLERD will promote the dissemination of the three best proofs of concept/prototypes on its website, social media, and mailing list.
Additional prizes may be announced in the near future.
IMPORTANT DATES
Submission of the abstracts of the demos: 5th March 2023, 23:59 GMT
Acceptance to Student Demo–Design Contest: 16th April 2023
Submission of final proofs of concept and prototypes: 30th April 2023
Presentation and demo at SLERD 2023 and winner announcement: during the conference.
RULES, HOW AND WHAT TO SUBMIT
Participation is open to all university students, master’s and PhD. Proposals may be submitted by individuals or teams; teams cannot exceed three members. Students interested in participating in the contest must submit a paper describing the problem that their demo solves, the theoretical background, and which ethical issues are made transparent by the demo. The prototype should be described by a video no longer than 3 minutes and by a paper no longer than eight (8) A4 pages, including sketches, figures, and links to demos (in the IxD&A format: see the authors’ guidelines).
The demos will be evaluated based on the following criteria:
- Students are the main target group
- The demo has the potential for implementation
- The demo is demonstrable, preferably one that conference participants can interact with
- Ethical issues are made transparent and understandable
Abstracts will be selected based on
- the clearness of the problem set,
- the implementation possibility of the proposed solution,
- the student-centred perspective,
- the interactivity of the prototype, and
- the organisation and readability of the description.
Submission by email to: bauters [at] tlu [dot] ee
JURY
Marwa Soudi (Chair), IdeasGYM, TLU Egypt
Mihai Dascalu, University Politehnica of Bucharest (telepresence)
Tania Di Mascio, University of L’Aquila
Gabriella Dodero, ASLERD (telepresence)
Päivi Kousa, Jyväskylä University
REFERENCES
• Akgun, S., & Greenhow, C. (2022). Artificial intelligence in education: Addressing ethical challenges in K-12 settings. AI and Ethics, 2(3), 431-440. https://doi.org/10.1007/s43681-021-00096-7
• European Commission, Directorate-General for Education, Youth, Sport and Culture. (2022). Ethical guidelines on the use of artificial intelligence (AI) and data in teaching and learning for educators. Publications Office of the European Union. https://data.europa.eu/doi/10.2766/153756
• Kousa, P., & Niemi, H. (2022). AI ethics and learning: EdTech companies’ challenges and solutions. Interactive Learning Environments, 1-12.
• Niemi, H. (2021). AI in learning: Preparing grounds for future learning. Journal of Pacific Rim Psychology, 15. https://doi.org/10.1177/18344909211038105