Volume 54, Issue 6 p. 1653-1670
ORIGINAL ARTICLE
Open Access

The role of design ethics in maintaining students' privacy: A call to action to learning designers in higher education

Ahmed Lachheb (Corresponding Author)
The Center for Academic Innovation, The University of Michigan, Ann Arbor, Michigan, USA
Correspondence: Ahmed Lachheb, The University of Michigan, Ann Arbor, MI 48109-1382, USA. Email: [email protected]

Victoria Abramenka-Lachheb
The University of Michigan, Ann Arbor, Michigan, USA

Stephanie Moore
Organization, Information, and Learning Sciences, The University of New Mexico, Albuquerque, New Mexico, USA

Colin Gray
Informatics, Indiana University, Bloomington, Indiana, USA
First published: 28 August 2023

Authors' Foreword: Throughout this paper, we refer to learning design (as a profession and discipline/field of design practice) and to learning designers (as professionals of learning design) as synonymous with instructional design and instructional designers, respectively.

Abstract

Maintaining students' privacy in higher education, an integral aspect of learning design and technology integration, is not only a matter of policy and law but also a matter of design ethics. Similar to faculty educators, learning designers in higher education play a vital role in maintaining students' privacy by designing learning experiences that rely on online technology integration. Like other professional designers, they need to care for the humans they design for by not producing designs that infringe on their privacy, thus, not causing harm. Recognizing that widely used instructional design models are silent on the topic and do not address ethical considerations such as privacy, we focus this paper on how design ethics can be leveraged by learning designers in higher education in a practical manner, illustrated through authentic examples. We highlight where the ethical responsibility of learning designers comes into the foreground when maintaining students' privacy and well-being, especially in online settings. We outline an existing ethical decision-making framework and show how learning designers can use it as a call to action to protect the students they design for, strengthening their ethical design capacity.

Practitioner notes

What is already known about this topic

  • Existing codes of ethical standards from well-known learning design organizations call upon learning designers to protect students' privacy without clear guidance on how to do so.
  • Design ethics within learning design is often discussed in abstract ways with principles that are difficult to apply.
  • Most, if not all, design models that learning design professionals have learned are either silent on design ethics or do not consider ethics a valid dimension, leaving design ethics largely excluded from learning design graduate programs.
  • Practical means for engaging in ethical design practice are scarce in the field.

What this paper adds

  • A call for learning designers in higher education to maintain and protect students' privacy and well-being, strengthening their ethical design capacity.
  • A demonstration of how to use a practical ethical decision-making framework as a designerly tool in designing for learning to maintain and protect students' privacy and well-being.
  • Authentic examples—in the form of vignettes—of ethical dilemmas/issues that learning designers in higher education could face, focused on students' privacy.
  • Methods, based on a practical ethical decision-making framework and grounded in the philosophy of designers as the guarantors of designs, that learning design professionals in higher education can employ to detect situations where students' privacy and best interests are at risk.
  • A demonstration of how learning designers could make stellar design decisions in service to the students they design for and not to the priorities of other design stakeholders.

Implications for practice and/or policy

  • Higher education programs/institutions that prepare/employ learning designers ought to treat the topics of the designer's responsibility and design ethics more explicitly and practically as one of the means to maintain and protect students' privacy, in addition to law and policies.
  • Learning designers in higher education ought to hold a powerful position in their professional practice to maintain and protect students' privacy and well-being, as an important aspect of their ethical design responsibilities.
  • Learning designers in higher education ought to adopt a design thinking mindset in order to protect students' privacy by (1) challenging ideas and assumptions regarding technology integration in general and (2) detecting what is known in User Experience (UX) design as “dark patterns” in online course design.

INTRODUCTION

Maintaining students' privacy in higher education is not only a matter of policy; any policy can be violated, especially when it contains “blind spots” that keep it from acting as the safeguard it ought to be. Design ethics is also important in addressing issues of students' privacy in higher education (Moore & Tillberg-Webb, 2023), especially when designed learning experiences rely heavily on online technologies. Similar to faculty educators, learning designers in higher education play a vital role in maintaining students' privacy by designing learning experiences that rely on online technology integration. Like other professional designers, they need to care for the humans they design for by not producing designs that infringe on their privacy (Blackmon & Major, 2023), thus not causing harm (Kozma, 2023). As the role of learning design professionals continues to gain importance in higher education institutions for the design and development of diverse learning experiences and programs (Ritzhaupt et al., 2021), and as demand for learning design grows, learning designers' work continues to have a larger impact on learners, educators and educational/learning systems at large. Now more than ever, learning designers have a consequential responsibility to make ethically driven design decisions that maintain students' privacy, especially in online settings.

Indeed, students' privacy is an integral aspect of learning design and technology integration in higher education, but achieving it is more difficult than it may seem. Therefore, the question remains: how can learning designers leverage design ethics to maintain students' privacy? When examining ethical guidelines from professional/academic associations, practical means for engaging in ethical learning design practice prove to be opaque. For instance, the Code of Ethics of the Association for Educational Communications and Technology (AECT, 2018) explicitly states that an AECT member “Shall conduct professional activities so as to protect the privacy and maintain the personal integrity of the individual learner” (p. 1). Yet, the code falls short of providing practical guidance for learning designers on how to protect learners' privacy. Similarly, the International Board of Standards for Training, Performance and Instruction (IBSTPI) explicitly indicates that “instructional designers do have ethical obligations and responsibilities and should be aware of them” (Spector et al., 2006, p. 7) and calls on designers to “Protect the privacy, candor and confidentiality of client and colleague information and communication” (IBSTPI, 2012), thus making design ethics one of the essential competencies under its professional foundations. Yet, this code also falls short in detailing how an instructional designer might protect the privacy of the client.

The same question of how learning designers can leverage design ethics to maintain students' privacy becomes further complicated when examining the well-known and widely used design models, frameworks and theories that learning designers learn to use in their practice. Design ethics within learning design is often discussed in abstract ways, with principles that are difficult to apply. Most design models are either silent on the topic or do not consider ethics a valid dimension (Moore, 2021), leaving design ethics largely excluded from learning design graduate programs. A problematic situation therefore arises when professional learning designers in higher education lack extensive training in, or at least a general awareness of, design ethics: they risk operating in service to powerful entities in their design projects, relegating students' privacy, needs and best interests to a secondary priority.

THE GOAL OF THE PAPER

In this paper, we share and demonstrate the use of a practical ethical decision-making framework that can be used day-to-day as a designerly tool (Stolterman et al., 2009) to maintain students' privacy in designing for learning in higher education. Through authentic examples—in the form of vignettes—and grounded in the philosophy of designers as the guarantors of designs (Nelson & Stolterman, 2014), we demonstrate methods—using the practical ethical decision-making framework—for learning design professionals in higher education to detect situations where students' privacy is at risk, and to make stellar design decisions in service to the students they design for, and not to the priorities of other stakeholders. We begin by first defining design ethics. Then, we outline a design thinking mindset that learning designers working in the higher education context ought to subscribe to in order to maintain students' privacy by (1) challenging ideas and assumptions regarding technology integration in general and (2) detecting what is known in UX design as “dark patterns” in online course design (Gray et al., 2021).

Challenging ideas and assumptions regarding technology integration is the practice of questioning the utility and the promised narrative of technology and how it impacts users (Moore & Tillberg-Webb, 2023). For example, commonly used web conferencing tools—while supporting the narrative of engaged audio-visual communication—also raise questions about communication privacy and personal data, with contemporary examples of these data being sold or used to train generative artificial intelligence technologies (Field, 2023). Dark patterns represent an increasingly common framing of manipulative technology practices, “where design choices subvert, impair, or distort the ability of a user to make autonomous and informed choices in relation to digital systems regardless of the designer's intent” (Gray et al., 2023). For example, an online learning experience could nudge learners to buy a premium subscription to a platform to enjoy uninterrupted access to course content and “enhanced privacy settings”. We elaborate further on challenging ideas and assumptions regarding technology integration and on dark patterns through an examination of designing for students' privacy and well-being, especially in online settings—the two contexts where the ethical responsibility of learning designers comes to the foreground.

DEFINING DESIGN ETHICS

Generally speaking, ethics describes the broad space of philosophical inquiry into how we should live and what a good human life should be (Chan, 2018). More specifically, design ethics comprises the standards and practices that guide how designers should act in the many situations they face in their professional practice. Design ethics, according to Parsons (2016), can refer to three types of relations between a designer and ethics: (1) the ethical issues/conflicts that emerge when designers apply rules and norms to what they are designing; (2) interrogating designers' choices about what they are designing; and (3) contemplating how design could change existing notions of ethics (Chan, 2018).

Ethics is different from morality, albeit these terms are often used synonymously or interchangeably. Morality is the strict adherence to social rules/norms that aim to compel public human behaviour (Parsons, 2016) (eg, requiring students to turn on their webcams during an online class meeting to mirror classroom settings where everyone can see everyone). Ethics is also not science; science does not tell us what we should do, as what is scientifically possible could be unethical (Markkula Center for Applied Ethics, 2021) (eg, the use of algorithms to detect/analyse hand gestures on camera and transcribe them into emojis or text, which entails the capacity to detect and analyse everything on camera, including body movements, facial expressions and what is in the background). Ethics is also not the same as law or religion. Laws and religious practices can be distant from what is ethical, even though both can be good sources of ethical behaviour (eg, a law can be ethically corrupt when it grants power to online technology companies, allowing them to harvest students' data and sell them for profit). Many people do not subscribe to a specific religion yet act ethically, and vice versa (Markkula Center for Applied Ethics, 2021). Kaurin, a moral philosopher, has suggested that ethics is better thought of as a process of “reflection, critical questioning, justification, argumentation and application of” morals, laws and values (2018, p. 4). Moore and Tillberg-Webb (2023) applied this definition of ethics as a process of reflection to learning design, suggesting that ethics may be better thought of as a form of reflective practice, much like the other forms of reflection-in-action that designers engage in, rather than as a set of rules to be followed (Schön, 1983; Tracey & Baaki, 2014).

Because morality, science, law and religion are different from ethics, one could wonder what constitutes the basis of ethics. This question has led philosophers, ethicists and theologians to suggest several lenses that help us perceive ethical dimensions, as presented in Table 1. With these lenses, the earlier question of how learning designers can leverage design ethics to maintain students' privacy becomes easier to address, particularly when examining the Rights and Care lenses outlined in Table 1.

TABLE 1. Ethical lenses (as outlined by the Markkula Center for Applied Ethics, 2021).
Rights: Humans have dignity based on their human nature and their ability to choose freely.
Justice: Each person should be given fair or equal treatment.
Utilitarian: An ethical action is the one that produces the greatest balance of good over harm for as many stakeholders as possible.
Common good: The interlocking relationships of society are the basis of ethical reasoning; respect and compassion for all others—especially the vulnerable—are requirements of common good reasoning.
Virtue: Ethical actions ought to be consistent with certain ideal virtues that enable us to act according to the highest potential of our character and values, like truth, beauty, honesty, courage, compassion, generosity, tolerance, love, fidelity, etc.
Care: Relationships and the need for one to listen and respond to individuals in their specific circumstances rather than merely following rules or calculating utility. This is where we talk about empathy to gain a deep appreciation of people's interests, feelings and viewpoints.

A DESIGNER MINDSET AS A FOUNDATION FOR THE ETHICALLY DRIVEN LEARNING DESIGNER

To be an ethically driven learning designer in higher education (or in another professional context), capable of engaging with ethical dilemmas and problems—such as students' privacy—one first should have a mindset of a designer (Boling et al., 2022; Dorst, 2011). This mindset of a designer means that a learning designer should consider themselves as sharing common ground with professional designers in other respected design disciplines (eg, architecture). A learning designer works in a professional design discipline that has a broad and significant impact on human learning and performance in diverse contexts and on society in general. Learning designers are not simply technicians of tools and techniques that support learning—they ought to think about students' privacy as a core issue when designing learning experiences that heavily rely on online technology.

Nevertheless, embracing this professional identity of a designer could be problematic. It can bring tension to the individual designer or their collaborators if a particular learning designer lacks the training and qualifications (often coming in the form of advanced academic credentials) to work and act as a professional designer. Therefore, completing the required rigorous training to become a learning designer (which could be achieved effectively through enrolling in a higher education program that is well-positioned for preparing future learning designers) would be the reasonable solution to this problem, as Kim (2018) aptly noted:

Of course you hire the [learning] designer who has been trained to be a [learning] designer. A professional with training in the science of learning. An expert on the theoretical frameworks and research-validated methods for course design. Someone who has been immersed [in] the literature on learning. A devotee of the scholarship of teaching and learning (SoTL). A skeptic of the efficacy of digital learning technologies, but a skepticism born out of deep expertise with the operation of the technologies and exposure to the data on their effectiveness. [emphasis added] (pp. 3–4)

Clearly, there is a responsibility that comes with rigorous and advanced training—the responsibility for design success and/or failure (Lachheb, 2020; Petroski, 1985, 1989, 2001, 2006, 2012), or, as framed by Nelson and Stolterman (2014), being the “guarantor of design” (p. 201) (ie, the human responsible for design success and/or failure). For example, a learning designer is responsible for making sure that a course/training they designed does not include an online technology that poses risks to students' privacy, thus harming their well-being. They therefore bear primary responsibility for the successful or failed outcomes of their design work. In this sense, learning designers in higher education should be equipped to own their design work and treat their contribution to every design project as instrumental in the lives of those their design work serves, mainly learners and educators. No matter the scale of the project or the power of stakeholders, a learning designer has the ethical obligation to be the guarantor of their design, as Nelson and Stolterman (2014) eloquently articulated:

To be a designer is therefore to be the co-creator of a new world. It is a calling of enormous responsibility, with its concomitant accountability. This is true even if each individual designer is only involved in a very small design act, playing merely a minor part in the totality of the redesign of an emerging new reality. Our individual designs will always be contributing causes to an overall composition that is an emergent new world. (p. 201, emphasis added)

In the context of higher education, where learning designers rely heavily on collaboration with educators (mainly faculty) and/or serve as agents of change who naturally face resistance and/or opposition, it is essential for them to work from this agency-oriented position (Campbell et al., 2005, 2009)—being the guarantor of design—and to claim it if they do not have it. Learning designers make important design decisions that impact learners' learning experiences and educators' practices overall (cf. Kenny et al., 2005; Rowland, 1992). Therefore, their design decisions could either maintain/protect or infringe upon students' privacy. In this case, and especially when learning designers have advanced design expertise and academic credentials, they ought to think of themselves as nonfaculty educators (Kim, 2023; Maloney & Kim, 2022) whose role in their institution's mission is as essential and important as that of faculty educators. Not only does this position require them to be more ethically aware of their design work and take responsibility for protecting students' privacy, but it also acknowledges learning designers as professionals who adhere to standards of practice and established methods in service of co-creating that new world.

LEARNING DESIGNER RESPONSIBILITIES: STUDENTS' PRIVACY AND DARK PATTERNS IN ONLINE COURSE DESIGN

The learning designer mindset advocated in the previous section is useful in identifying ethical situations where students' privacy is at risk and translating them into design parameters or constraints that inform the learning experiences that they design. This mindset allows learning designers to maintain students' privacy by (1) challenging ideas and assumptions regarding technology integration in general and (2) detecting what is known in UX design as “dark patterns” in online course design (Gray et al., 2021). Without subscribing to this design mindset, learning designers will have a hard time identifying ethical dilemmas and issues of students' privacy and devising solutions to ethical conundrums.

Challenging ideas and assumptions regarding technology integration

A common perception of learning designers is that of technology enthusiasts and advocates (Wiley, 2023). However, learning designers know that their role extends beyond technology integration as simplistic adoption (cf., Kumar & Ritzhaupt, 2017). Historical reviews of technological developments reveal a large space between binary options of accept or reject, characterized instead by a “shaping” or design process where various individuals or groups act on the technology. Socio-constructivist analyses of technology, such as Pinch and Bijker's (1984) framework, highlight how people with different interests, values and needs act on emerging, new and existing technologies to shape them into contextual adaptations. These adaptations can take a myriad of forms and even lead to branches of design variations. Technology is not merely adopted; it is worked into shape to align with needs, context, values, etc. These adaptations reflect human agency through the many choices made along the way, and that human agency presents opportunities for designers to incorporate students' privacy, security, dignity and other ethical considerations into designs and decisions.

Learning analytics and artificial intelligence tools similarly require work on the part of learning designers. Using professional tools and methods such as analysis, design reasoning and synthesis, learning designers do not merely make multiple-choice selections of technologies but instead engage in the work of shaping technologies through artistry, effort and tools (hard and soft). To do that work, learning designers use tools and methods such as instructional design models, learning theories and sciences, and specific tools such as learner analysis, context analysis, strategy selection, assessment design and many others. Although ethical dimensions of designers' work are not presently represented in learning design models and processes (Moore, 2021), considerations of privacy, well-being, data rights and other harms or benefits can be incorporated into design like any other design specification or parameter (Moore & Tillberg-Webb, 2023; Whitbeck, 1996). For example, analysis can include identifying potential harms and benefits—both specific to learning and more systemic—which can then function as design constraints or specifications that, like other design constraints, the learning designer seeks to optimize or minimize to the extent possible. Here again, the design mindset is important: rather than assuming there is a “perfect” or “right” option, the design process can yield a range of options. The design mindset also foregrounds ways in which a learning designer can adapt a tool or system—or design possible solutions around it—to address privacy and other issues that arise in practice.

One example of this is Scholes' (2016) approach to learning analytics. Scholes started by querying how students are treated in an institution's vision for learning analytics: are they agents who should have opportunities to exercise their agency in their learning and with their data? This activity of articulating a vision helps to frame learning analytics to include ethical considerations. Using the principle of learner agency, Scholes (2016) employed ethical analysis alongside learner and needs analysis to identify various potential benefits and harms of learning analytics where some conflicting design constraints start to present themselves. Potential benefits include identifying students who are at risk of failing or dropping out and adapting learning and systems to differentiate support and differentiate learning (Ifenthaler & Widanapathirana, 2014). Potential risks include “normative practices” that can lead to discrimination or marginalization, loss of individual agency and control over personal data, and degradation of personal data rights and security (Boyd & Crawford, 2012). The presence of potential harm is not an automatic “trump card” for technology evaluation, though. Scholes (2016) exemplifies this, not asking a question of whether to adopt or reject learning analytics but instead, “How can we capture this valuing of individual agency whilst employing learning analytics?” (p. 951). She then explores various ways in which designers can further support or enable learner agency.

Detecting dark patterns in online course design

Students' privacy is an integral aspect of learning design and technology integration in higher education, yet achieving it when integrating technology is more difficult than it may seem. Agency in and through learning design is therefore essential for understanding dark patterns in online learning systems and for addressing them meaningfully and pragmatically. Dark patterns describe “instances where design choices subvert, impair, or distort the ability of a user to make autonomous and informed choices in relation to digital systems regardless of the designer's intent” (Gray et al., 2023). While this concept first arose in User Experience (UX) design, this label for unethical design practices has since been used broadly by designers, human-computer interaction scholars, regulators and legal professionals to indicate deceptive, manipulative, or coercive user experiences that favour shareholder value over end-user value (Gray et al., 2018, 2021).

Instances of dark patterns are present in digital experiences in a broad sense but have not been specifically studied in online learning experiences. However, the surveillance qualities that define the modern web (cf., Zuboff, 2019) provide numerous opportunities for students' choices to be subverted or for their actions to be forced—and indeed, dark patterns have been shown to impact technology addiction (Chaudhary et al., 2022; Narayanan et al., 2020) and the developmental behaviours of children (Fitton et al., 2021). Frighteningly, as noted in Gray and Boling (2016, 2018) and later in Gray (in press), the issues of autonomy and agency have rarely been addressed as a central concern in learning design—and indeed, learners/students often have little control over what data are collected about them, how these data are stored, or what impact these data have over their learning experience or assessments. Their privacy remains at risk as long as learning designers do not address it in their learning design work. These issues of privacy and security were amplified during the pandemic, with surveillance technologies such as facial recognition and other computer vision techniques commonly forced onto students learning in their own homes without their meaningful consent (Gray, in press).

Employing the language of dark patterns, learning designers should examine how learner agency and autonomy are treated in online experiences. Using existing typologies of dark patterns as a point of departure, learning designers in higher education may evaluate the potential presence of patterns that subvert autonomy—for instance, through obstruction of user choice, interface interference to steer users towards outcomes desired by the learning designer, social engineering that uses social pressure to manipulate user choice, or even forced action that requires users to engage in certain functionality (Gray et al., 2018, 2023). As an example of surveillance, learning designers might consider how forced data capture might require a learner to provide a visual scan of their living space or download intrusive software on their computer—representing a loss of user autonomy and agency. Similarly, learning designers might seek to identify which choice architectures (ie, which choices are presented and in what visual or interactive form) are presented to users and to what degree they are privacy- or autonomy-preserving. While creating a structure for a learning experience as a form of scaffolding can be legitimate, a learning designer should consider whether and how they are valuing and providing room for learners to exercise autonomy over their learning experience, particularly when it relates to issues such as privacy and well-being. Even further, using dark patterns as a rhetorical lens also enables learning designers to more broadly consider the balance of learner value and platform value—which is an inherently ethical act.

A PRACTICAL DESIGNERLY TOOL FOR THE ETHICALLY DRIVEN DESIGNER

Design ethics is often discussed in abstract ways with principles that are difficult to apply, creating the likelihood that learning designers—those with and without formal training in the profession alike—might struggle when facing ethical dilemmas and problems surrounding students' privacy. The lack of guidance and practical tools on design ethics in learning design is the aetiology of this problem (Moore & Ellsworth, 2014). There are several new and emerging tools—in the abstract sense of the word—that learning designers can use to make their work more ethically driven (cf., Moore & Tillberg-Webb, 2023). Recently, Chivukula et al. (2022) built a collection of ethics-focused methods that designers can employ in their daily practice.1 Professional learning designers in higher education, like designers in other design disciplines, need to be able to identify the ethical dimensions of every project, situation, or context they are operating in and decide how to apply their ethical lenses to their design work, especially when students' privacy is at risk. One of the most practical and designerly (Stolterman et al., 2009) tools that learning designers in higher education can use is the Framework for Ethical Decision-Making by the Markkula Center for Applied Ethics (2021) at Santa Clara University. This framework includes a five-step process in which designers ask questions in a reflective manner and, based on the answers to these questions, act in an ethical manner.

Firstly, this framework details six (6) different lenses that help designers perceive ethical dimensions (Table 1). We revisit these lenses and dimensions because each helps a learning designer determine what standards of behaviour in their design practice are to be considered right and good, especially the Rights and Care lenses as they relate to students' privacy. Each lens is elaborated through an example drawn from learning design practice (Table 2).

TABLE 2. The six ethical lenses in the framework for ethical decision-making by the Markkula Center for Applied Ethics at Santa Clara University.
Rights: Students can decline to use proctoring software or download a tool that invades their privacy.
Justice: All learners have to submit an assignment by a certain deadline, for example, but one or more students get an extension because they have a documented learning disability, and that is fair for them and for other students.
Utilitarian: Providing closed captions and screen-reader-friendly documents not only to students who request them but to all students would produce the greatest good—accessible media benefits more than just students with disabilities.
Common good: Online students (who do not come to campus) need to comply with the university vaccination policy.
Virtue: Using ChatGPT for assignments is considered academic misconduct.
Care: A learning designer refuses to embed a third-party tool in the learning experience because it harms students' privacy and well-being.

It is important to acknowledge that there is no wide consensus on the content of these lenses. In fact, there is no broad agreement even on the same set of human and civil rights, as is evident in today's policies and politics. Humans have historically disagreed—to the extent of waging wars against each other—on what constitutes the common good. The different lenses may lead to different answers to the question, “What is ethical?” Thus, each lens gives the learning designer important insights into the process of deciding what is ethical in a particular circumstance.

Secondly, the framework includes a five (5) step process: (1) identifying or recognizing the ethical issues; (2) getting the facts; (3) evaluating alternative actions; (4) making a decision and testing it; (5) acting and reflecting on the outcome (Figure 1).

FIGURE 1. The five steps in the framework for ethical decision-making by the Markkula Center for Applied Ethics at Santa Clara University.

These five steps can be enacted at any point throughout the design process, and at any time the learning designer feels that they are facing an ethical dilemma or an issue of students' privacy. Yet, it might be helpful to rely on the framework early, in the problem-framing stage of the design process. Ethical reflection can start when designers make “IF” statements and consider whether their choices might cause any harm. In the next section, we demonstrate how to use the five-step process through authentic examples—in the form of vignettes. These vignettes are examples of ethical issues for learning designers in higher education where students' privacy is at risk.

VIGNETTES OF LEARNING DESIGNERS IN HIGHER EDUCATION: ETHICAL ISSUES AND THE USE OF THE FRAMEWORK FOR ETHICAL DECISION-MAKING

Through the vignettes described below, we demonstrate methods for learning design professionals in higher education to detect situations where students' privacy is at risk (or potentially violated) and how they could make stellar design decisions in service to the students they design for and not to the priorities of other design stakeholders.

Vignette 1: Are they really paying attention?

Samer, an experienced learning designer at a public university with a Senior Instructional Designer job title, is working with Dr. James, a computer science professor, to redesign an introductory Python course into a hybrid format. Dr. James wants to use a new web conferencing tool that tracks students' eye movements to measure their engagement. He bought a licence for this tool, and he wants to use it because he was not satisfied with his teaching experience during the COVID-19 lockdown, saying: “I really saw how most students were either not paying attention to me while I was teaching or turned off their cameras. It is obvious that they were not paying attention to me, and that's not good learning. I want to require in my course syllabus that all students must turn their cameras on, or they will be marked as absent from class.” Samer is concerned about the implications of this tool, but Dr. James believes it will help him to improve his teaching. Samer knows from previous research studies that tracking eye movements could provide useful insights into student engagement with the materials they interact with on screens. However, with this tool, there are many concerns about privacy, such as a potential risk of invasion of one's personal space or poor detection outcomes for neurodivergent learners. He understands the need to balance the benefits of the tool with the ethical implications of its use.

Design ethics in practice: Are they really paying attention?

Following the framework presented earlier, Samer could apply the rights lens to this ethical dilemma—humans have dignity based on their human nature and their ability to choose freely. Alternatively or additionally, Samer could apply the care lens—relationships and the need for one to listen and respond to individuals in their specific circumstances rather than merely following rules or calculating utility. Once one or both of these lenses are applied, Samer is able to frame the ethical situation as follows: “If the faculty member uses this tool, there will be an increased risk of violating students' privacy. Students' privacy is sacred and must not be violated.”

Therefore, following the first step of the framework—identifying or recognizing the ethical issues—Samer can arrive at the following answers to the questions that ought to be asked in this step, outlined in Table 3.

TABLE 3. Questions and answers for the first step of the framework: Identifying or recognizing the ethical issues.
Q: Could this [framing] decision or situation be damaging to someone or to some group, or unevenly beneficial to people?
A: Using this tool could penalize students with low bandwidth or camera shyness. It could lead to uneven grading.
Q: Does this decision [framing] involve a choice between a good and bad alternative, or perhaps between two “goods” or between two “bads”?
A: The choice can be made between using this tool and the university-supported web conferencing tool. The latter requires faculty to ensure engagement through other pedagogical means.
Q: Is this issue [identified in the framing stage] about more than solely what is legal or what is most efficient? If so, how?
A: The use of this tool could be illegal under the university's IT policy. It also cannot guarantee engagement; if anything, the opposite. It puts students' right to privacy at risk.

In order to gather further facts about this issue—the second step of the framework—Samer could arrive at the following answers outlined in Table 4.

TABLE 4. Questions and answers for the second step of the framework: Getting the facts.
Q: What are the relevant facts of the case? What facts are not known? Can I learn more about the situation? Do I know enough to make a decision?
A: The tool is not licensed by the university and poses a security risk to students' and faculty's data. It is unclear whether faculty have the right to use unlicensed tools under academic freedom. It would be wise to consult IT staff and governing policies to learn more.
Q: What individuals and groups have an important stake in the outcome? Are the concerns of some of those individuals or groups more important? Why?
A: Both students and faculty have a stake in the outcome. While faculty concerns are legitimate, students' privacy concerns are more important because the faculty concerns can be addressed through alternative pedagogical means.
Q: What are the options for acting? Have all the relevant persons and groups been consulted? Have I identified creative options?
A: The first two options are to consult IT staff and policy experts. A creative option is to encourage students to use cameras, embed engaging activities in the course design and assess student engagement.

To evaluate alternative actions—the third step of the framework—Samer could now answer the questions based on the rights and/or the virtue lenses as outlined in Table 5.

TABLE 5. Questions and answers for the third step of the framework: Evaluating alternative actions.
Q: Which option best respects the rights of all who have a stake? (The Rights Lens)
A: Not using the tool is the best option for respecting the rights of all stakeholders, especially if cybersecurity IT staff do not endorse it.
Q: Which option leads me to act as the sort of person I want to be? (The Virtue Lens)
A: Preventing the use of this tool and flagging it is an option that leads me to act as the sort of person I want to be.

To choose an option for design action and test it—the fourth step of the framework—Samer could arrive at the following answers based on the questions outlined in Table 6.

TABLE 6. Questions and answers for the fourth step of the framework: Making a decision and testing it.
Q: After an evaluation using all of these lenses, which option best addresses the situation?
A: Not using this tool is the best option to address the situation.
Q: If I told someone I respect (or a public audience) which option I have chosen, what would they say?
A: They would appreciate my ethically driven decision.
Q: How can my decision be implemented with the greatest care and attention to the concerns of all stakeholders?
A: Talking to Dr. James about the ethical rationale, presenting facts, and offering pedagogical solutions are ways to implement the decision with care and attention to stakeholders.

In the fifth and last step of the process—acting and reflecting on the outcome—Samer could reflect on how his decision turned out, ideally protecting students' privacy while preventing the faculty member from violating the law. A possible follow-up action would be bringing his lived design experience to his teammates and supervisors and advocating for a clearer understanding of students' privacy within the community of educators at his university.

Vignette 2: Yes, this is a free e-book!

Chauncey, a former K-12 teacher with a Learning Experience Designer job title at a liberal arts college, is using a free e-book in the design of a MOOC with Dr. Watson, a professor of human psychology. The e-book platform required her to create an account, answer questions and scroll through multiple products before she could access the free e-book. The book was filled with ads, and she could not turn them off without paying for a subscription. She also started receiving marketing emails from the platform after creating her account. Dr. Watson had never seen these issues before. The platform's position was that the ads were necessary to offer a free book: “With free platforms, there are some annoying ads, but the benefit for the learners here is greater. They get to access a free book and learn from it. Even though they can pay the platform and get a PDF copy of the book, this is a free book.” Chauncey realized at this point that she was facing an ethical issue—the platform, while it offered a free e-book, clearly nudged learners to pay for that free book and for other products.

Design ethics in practice: Yes, this is a free e-book!

Following the framework presented earlier, and similar to Samer's case, Chauncey could apply the utilitarian lens to this issue (see Table 1). Alternatively or additionally, Chauncey could apply the common good lens—the interlocking relationships of society are the basis of ethical reasoning. Once one or both of these lenses are applied, Chauncey is able to frame the ethical situation as follows: “If this e-book is linked to the MOOC, learners will be forced to take actions against their best interests—buy a product they were promised for free or give up their information for marketing purposes.” Therefore, following the first step of the framework—identifying or recognizing the ethical issues—Chauncey can arrive at the following answers to the questions outlined in Table 7.

TABLE 7. Questions and answers for the first step of the framework: Identifying or recognizing the ethical issues.
Q: Could this [framing] decision or situation be damaging to someone or to some group, or unevenly beneficial to people?
A: The e-book platform is designed to benefit the company, not the learners. Learners are nudged to give up their information or buy a book they were promised would be free.
Q: Does this decision [framing] involve a choice between a good and bad alternative, or perhaps between two “goods” or between two “bads”?
A: The decision to use this e-book platform involves a choice between two bads: using it as it is or paying for the premium version.
Q: Is this issue [identified in the framing stage] about more than solely what is legal or what is most efficient? If so, how?
A: The issue of using the e-book platform is about more than legality and efficiency; it is about learner autonomy and not misleading learners. Users consent to the platform's terms and conditions, and hosting the e-book for free is efficient, but the issue goes beyond both.

In order to gather further facts about this issue—the second step of the framework—Chauncey could arrive at the following answers outlined in Table 8.

TABLE 8. Questions and answers for the second step of the framework: Getting the facts.
Q: What are the relevant facts of the case? What facts are not known? Can I learn more about the situation? Do I know enough to make a decision?
A: The faculty member does not know how the platform works. Publishers need profit to stay afloat. The faculty member owns the copyright. It would be wise to talk to the authors and the librarians to learn more and make a decision.
Q: What individuals and groups have an important stake in the outcome? Are the concerns of some of those individuals or groups more important? Why?
A: The learners and the publisher are the two groups with an important stake in the outcome. The unrepresented concerns of the students are more important.
Q: What are the options for acting? Have all the relevant persons and groups been consulted? Have I identified creative options?
A: The primary option is to discuss the issue with the authors and librarians. A creative option is providing a PDF of the pre-print version.

To evaluate alternative actions—the third step of the framework—Chauncey could answer the questions based on the utilitarian and the common good lenses as outlined in Table 9.

TABLE 9. Questions and answers for the third step of the framework: Evaluating alternative actions.
Q: Which option will produce the most good and do the least harm for as many stakeholders as possible?
A: Providing the book in PDF format based on the pre-print version.
Q: Which option best serves the community as a whole, not just some members?
A: In addition to the PDF, provide a link to the premium version. Give learners a choice and disclose to them the difference between the free and paid experiences.

To choose an option for design action and test it—the fourth step of the framework—Chauncey could arrive at the following answers outlined in Table 10.

TABLE 10. Questions and answers for the fourth step of the framework: Making a decision and testing it.
Q: After an evaluation using all of these lenses, which option best addresses the situation?
A: The best options are providing the pre-print PDF, providing a link to the premium version of the book, and giving the learners a choice between the pre-print version and the premium option to buy the e-book. A disclaimer to learners about the difference between the free and the paid experiences on the platform would be essential.
Q: If I told someone I respect (or a public audience) which option I have chosen, what would they say?
A: They would appreciate my ethically driven decision and the happy medium I created.
Q: How can my decision be implemented with the greatest care and attention to the concerns of all stakeholders?
A: Sharing the ethical rationale with Dr. Watson and giving learners a disclaimer about the difference between the free and the paid experiences on the platform are the best ways to implement the decision with the greatest care and attention to the concerns of all stakeholders.

In the fifth and last step of the process—acting and reflecting on the outcome—Chauncey could reflect on how her decision turned out, ideally protecting the MOOC learners from a dark pattern in a third-party platform. A possible follow-up action would be auditing other MOOCs that the university provides to find out whether they rely on the same publishing platform.

CONCLUSION

A core argument of this paper is that maintaining students' privacy—as an integral aspect of learning design and technology integration in higher education—is not only a matter of policy and law but also a matter of design ethics. We supported this argument by demonstrating that learning designers in higher education, similar to faculty educators, play a vital role in maintaining students' privacy by designing learning experiences that rely on online technology integration. Therefore, like other professional designers, learning designers in higher education need to care for the humans they design for by not producing designs that infringe on their privacy (Blackmon & Major, 2023), thus, not causing harm (Kozma, 2023).

Recognizing that students' privacy is an integral aspect of learning design and technology integration in higher education but that achieving it is more difficult than it may seem, acknowledging the shortcomings of the codes of ethics of well-known professional international organizations (AECT and IBSTPI) in providing practical means to employ ethics in maintaining students' privacy, and acknowledging the silence of instructional design models on the topic of ethics (Moore, 2021), we addressed the complex question of how learning designers can leverage design ethics to maintain students' privacy. We first defined design ethics by distinguishing it from other concepts, such as morality, law and religion, and outlined six lenses that represent the basis of ethics/design ethics (Markkula Center for Applied Ethics, 2021). Then, we outlined a designer mindset (Boling et al., 2022; Dorst, 2011) as a foundation for the ethically driven learning designer. This mindset calls upon learning designers in higher education to assume a higher responsibility (Nelson & Stolterman, 2014) for their design work and to work from an agency-oriented position (Campbell et al., 2005, 2009) to maintain students' privacy by (1) challenging ideas and assumptions regarding technology integration in general (Moore & Tillberg-Webb, 2023) and (2) detecting what is known in UX design as “dark patterns” in online course design (Gray et al., 2021). We then described a design ethics framework and illustrated, through vignettes, practical means for learning designers to engage in ethical design practice.

This ethically driven design practice requires the designer to be aware of the values they are incorporating into their design work and the ways they are valuing students' privacy and well-being. Our articulation of how ethics emerges in learning design practice points towards numerous areas for future research and support. First, while we have introduced a framework to support design action, more tools (and tool elaboration) are needed to address different sets of values, particular contexts and technologies, and emergent threats to learner well-being and privacy. Second, the concept of “dark patterns” may be useful in considering ethically problematic practices in learning design, pointing towards the need to audit and describe common types of design implementations that reduce learner agency or autonomy—particularly among vulnerable groups or in relation to threats towards privacy or wellbeing. Third, the vignettes we have described represent common tensions in practice, but other ways of “wrangling” design complexity could be considered more thoroughly (cf., Chivukula et al., 2023), including utilizing ethical dilemmas, tensions and situations to explore more ethically centred ways of practicing design.

Across these areas of potential future research, our goal is for learning designers to be positioned as powerful agents of the design process—integrating the needs of learners and other stakeholders into their design decisions and considering how beneficial and impactful their final product (eg, a course, training, or a program) will be for their target audience. Acknowledging this responsibility, and providing a range of supportive resources, will encourage learning designers to operate with a more ethics-centred mindset when designing.

FUNDING INFORMATION

This scholarly work did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

CONFLICT OF INTEREST STATEMENT

The authors have no conflicts of interest to declare.

ETHICS STATEMENT

The authors have no ethical disclosures to report.

Endnote

1 See https://everydayethics.uxp2.com/methods

DATA AVAILABILITY STATEMENT

Data sharing not applicable to this article as no datasets were generated or analysed during the current study.