This will be one of a few reflections on EduCon 2.8, a conference and conversation held in Philadelphia, January 29 to 31.
For the first conversation Saturday, I listened to Fran Newberg, Jim Siegl, Jeff Graham, and Bill Fitzgerald present Four Perspectives on Navigating the Student Privacy Landscape. Increasingly, schools, teachers, administrators, families, students, and legislative bodies struggle to define policies and practices to keep students safe online. What comes of these conversations is often a narrative of fear.
In this morality play, our main character Student travels through the Internet, leaving breadcrumbs in the form of data and content.
Trailing behind Student are the evil actors Company and Anonymity, who pick up the breadcrumbs, which they magically transform into both traps to ensnare Student and revenue for Company. Privacy at times offers an invisibility cloak, but Student rarely takes this protective garment. Consent, too, intervenes by illuminating forks in the road: Terms of Agreement. One road marked “I Agree” is dimly lit but offers an easy path, and the other, “I Don’t Agree,” clearly shows obstacles and rough terrain. Student frequently chooses “I Agree,” and easily falls into hidden traps set by both Company and Anonymity. Weeping, Student calls out to Teacher, Parent, and Government to deliver him. These three cast a spell, FERPA, which plucks Student from the grasp of Company and Anonymity, and sets Student safely outside the Internet’s Gate.
If this is the Fear Narrative, what narratives can we tell, and should we tell, to counter it? After yesterday’s conversation, I think the story is more postmodern novel than morality play, and the message will be a dialectic driven by questions, guides, and best practices.
The session began with a list of motives: Why were we there, and what did we hope to learn?
Some came to better understand privacy, so they could improve their local schools’ policies, while others hoped to learn ways to inform teachers. Themes emerged around best practices, transparency, open educational resources, ethical sharing of student content and data, and my favorite, Google, or “The G Word.” The overarching question seemed to be, “How do we teach teachers about student privacy and use educational technology while empowering student creativity safely?” From there, questions emerged to ask of any tool:
- Is it private? What options and privacy settings exist?
- Does it use effective and sophisticated encryption?
- Do teachers, schools, and students keep the content? After the account is deleted, where does the information go?
- Are the data and content mined?
- Are the data and content aggregated?
This process can be overwhelming for school systems and teachers. Luckily, there is some help and guidance. First, the Houston Independent School District’s Educational Technology department offers the PSS rubric for evaluating educational web tools and apps. Second, Common Sense is working on building a knowledge base for schools to use. Look for details from Jeff Graham and Bill Fitzgerald.
Teachers also need to know federal and state laws regarding student privacy and data. With FERPA, COPPA, PPRA, HIPAA, CIPA, E-Rate, and as many as 158 state bills, this is no easy task, especially when there are loopholes.
AUPs and Balance
Beyond the teachers, schools can help frame the narrative around trust instead of fear. If schools decide to ban all forms of technology in the classroom, then parents and families will distrust technology and its educational role, especially if those families do not regularly use such tools at home or at work. Schools can help inform parents about their goals for students and how technology will be incorporated to achieve those goals. The tools, their applications, and the school’s goals should be transparent and easily understood. Families should be able to opt out, and schools should provide a list of alternative applications and software for parents, so they can make informed decisions.
However, technology is not and should never be the goal. Thoughtless adoption of educational technology is just as egregious as a rigid ban. AUPs should be allowed to grow and change depending on the schools’ and their stakeholders’ needs.
Going Beyond Policy
Around mid-session, the conversation shifted from best practices to a general discussion of risk management, because ultimately, a school’s or teacher’s decision to use technology and to create policies around its use is an act of risk management. What is the risk of opting in versus the reward? Does the reward justify the risk? And finally, what are the real risks?
Data Collection: What is it and how do you expose it?
To illustrate some of the risks, the presenters suggested this activity, which partners well with any digital citizenship unit or dystopian literature unit.
First, use Mozilla’s Firefox, and get a cookie reader plug-in. For this experiment, I picked Lightbeam for Firefox, which visualizes the relationships between the sites you visit and third-parties.
Second, download Privacy Badger from the Electronic Frontier Foundation. This will let you see the third-party trackers and block them.
Then, with those tools, do some simple Internet surfing, like browsing for books on Amazon, checking Facebook, and maybe checking to see if you have strep throat on WebMD. Just one search in WebMD connected me to 18 trackers.
Finally, consider how ads and content follow you online. In my online 8th grade Language Arts course, we’re reading The Giver, and this activity pairs well with how the Elders supervise the community.
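For a classroom that wants to peek under the hood, the same idea can be sketched in a few lines of code. This is my own minimal illustration, not how Lightbeam or Privacy Badger actually work: it scans a page’s HTML for scripts, images, and iframes loaded from hosts other than the page’s own. The sample markup and hostnames are hypothetical.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ThirdPartyFinder(HTMLParser):
    """Collect hostnames of scripts, images, and iframes served from other domains."""
    def __init__(self, first_party):
        super().__init__()
        self.first_party = first_party
        self.third_parties = set()

    def handle_starttag(self, tag, attrs):
        if tag not in ("script", "img", "iframe"):
            return
        src = dict(attrs).get("src", "")
        host = urlparse(src).hostname
        if host and host != self.first_party:
            self.third_parties.add(host)

# Hypothetical page markup, for illustration only.
sample_html = """
<html><body>
  <script src="https://example.com/app.js"></script>
  <script src="https://tracker-one.example.net/beacon.js"></script>
  <img src="https://ads.example.org/pixel.gif">
</body></html>
"""

finder = ThirdPartyFinder("example.com")
finder.feed(sample_html)
print(sorted(finder.third_parties))
# ['ads.example.org', 'tracker-one.example.net']
```

Real trackers are loaded dynamically and are far harder to spot, which is exactly why the browser tools above are worth installing.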
Back to Definitions and Students’ Perspectives
Teachers and students must be aware of the risks, rewards, and options when using technology, but they must also be speaking the same language. What “content” means to a teacher may not mean the same to one student or his/her lab partner. The same with “privacy” and “consent.” Open conversations, as opposed to outright acceptance or bans, help create common definitions understood by all stakeholders.
Education Technology Needs to Step Up
Teachers, schools, and governments should not be alone in making technology safe and available for students. Educational technology companies must improve their practices. Common Sense Media is working on initiatives to improve transparency and accountability. For example, they are researching the readability of terms of agreement and creating a database that will display the Flesch-Kincaid grade level, word count, and average words per sentence. The database will also include a “Shouting Index,” which shows the percentage of capital letters versus lower-case letters. Nothing in legislation requires companies to use so many phrases in capital letters, yet many do, even though it makes the text more difficult to read and understand. Some phrases, sentences, and paragraphs are copied from other policies. The presenters suggested finding a seven- to ten-word phrase in the third or fourth paragraph and doing a Google search to see how many times that phrase appears online and for which companies. Such results should be a red flag for educators.
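These metrics are easy to approximate yourself. Here is a minimal sketch of my own (not Common Sense Media’s tooling) that computes word count, average words per sentence, and a “Shouting Index” for a policy excerpt; the excerpt itself is made up.

```python
import re

def policy_stats(text):
    """Return word count, average words per sentence, and percent capital letters."""
    words = re.findall(r"[A-Za-z']+", text)
    # Treat ., !, ? as sentence terminators; crude but adequate for a demo.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    letters = [c for c in text if c.isalpha()]
    upper = sum(1 for c in letters if c.isupper())
    return {
        "words": len(words),
        "avg_words_per_sentence": len(words) / max(len(sentences), 1),
        "shouting_index": 100 * upper / max(len(letters), 1),
    }

# Hypothetical terms-of-service excerpt.
excerpt = ("THE SERVICE IS PROVIDED AS IS WITHOUT WARRANTY OF ANY KIND. "
           "You agree to these terms by using the service.")
stats = policy_stats(excerpt)
print(stats)
```

Run it on a real terms-of-agreement page and the all-caps disclaimer sections push the Shouting Index up quickly, which is precisely the readability problem the database aims to expose.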
Why this matters now, especially for Georgia
On January 22, the Georgia Senate read and referred SB 281, which, if passed, would “require schools to provide certain information to students and parents prior to using any digital-learning platform.” Last week my eLearning Environments course at GSU read, analyzed, and evaluated this bill, and our general consensus was that it would put too many demands on teachers and schools. What kinds of demands? Although the summary says “certain information” and seems mild, the bill includes
> Such explanation shall include an understandable description
>
> (1) How the platform works and its principal purpose or purposes;
>
> (2) The title and business address of the school official who is responsible for the platform and the name and business address of any contractor or other outside party maintaining the platform for or on behalf of the school;
>
> (3) The information the software is designed to collect from or capture and record about the student, including any data matches with other personally identifiable information;
>
> (4) Every element of data that the platform or software will collect or record about the student, including any personal psychological characteristics; noncognitive attributes or skills, such as collaboration, resilience, and perseverance; and physiological […]
>
> (5) The purpose of collecting and recording such data;
>
> (6) Every contemplated use or disclosure of such data, the categories of recipients, and the purpose of such use or disclosure; […]
>
> (8) The policies and practices of the school regarding storage, retrievability, access controls, retention, and disposal of the records collected or recorded by the platform.
Such legislation does not provide feasible solutions for our complex needs and uses of educational technology. Although I believe it comes from an earnest desire to protect students, as written it puts unrealistic demands on teachers to delineate every use of every technology in language parents can understand. Such measures add to the narrative of fear. Instead, teachers and schools must change the narrative from a simplistic fairy tale with identifiable enemies into a complex dialogue between stakeholders.
Click here for the Google Slides presentation.
Click here for a video of the discussion.