Can Higher Education Make Silicon Valley More Ethical?
Technology is amoral. It is up to its users to resist the urge to do evil with it, whether cyber-bullying, spreading fake news and racist or religiously bigoted memes, slander or libel, plagiarism, or outright fraud and theft. Face it: evil is tempting. So it is with computing and web use. Most of us do not have the ability to code, but we certainly can compose messages -- some of which can hurt others emotionally and vocationally.

Can Higher Education Make Silicon Valley More Ethical?


By Nell Gluckman | March 14, 2018

The internet and the technology companies powering it have shown their dark side recently. Racism and sexism have flourished, mostly unchecked, on social media. Algorithms used by Facebook and Twitter have been blamed for the spread of fake news. And as phones, cars, and household devices scoop up their users’ data, the expectation of privacy has practically evaporated.


Under each of those phenomena lie ethical quandaries. Is technological development outpacing our ability to tease out its implications? If so, is higher education responsible for the problem?

Jim Malazita, an assistant professor of science and technology studies at Rensselaer Polytechnic Institute, believes higher education has played a role. He thinks there’s something about how the STEM disciplines are taught — science, technology, engineering, and mathematics — that discourages students from considering ethical questions as they learn the skills they need to work for big technology companies. But if colleges and universities are contributing to the problem, then they can also help fix it.

With funding from the National Endowment for the Humanities, Malazita is piloting an initiative to inject discussions of ethics and politics into introductory computer-science courses at Rensselaer, in New York. He is pushing back against the idea that programmers should focus purely on technical work and leave the softer questions about how their products are used to social scientists. He hopes his students will see it as their job to build socially responsible technology.


Q. How is what you’re trying to do different from the way ethics and computer science are usually taught?


A. Rarely will you talk to a STEM student who says ethics aren't important. But by the time they're done with their education, they're like, It's the job of other people around me to make sure this technology is doing the right thing.

Rather than pairing computer science with a suite of courses to make computer science ethical, what if we get humanists into core computer-science classes to get students to think about the ethics and politics of computer science as part of their core skill set? How can we teach you Python and coding, but at the same time always talk about coding as a political practice?

Q. What will that look like in your course?


A. We’re using data sets about various social issues, such as race and violence in New York City, and a Unesco database about education funding. We’re saying, Here are these data sets you’re going to have to crunch through using Python. What do these algorithms leave out? What can’t you account for?

We’re thinking through how to teach the use of code while also showing how the code shapes the way you think about the database. Every language you learn has a bias to it, so let’s acknowledge that.
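
As a rough sketch of the kind of exercise described here -- crunching a data set in Python while asking what the tally leaves out -- consider the snippet below. The file name and column names are hypothetical, not the actual course material; the point is only that an unrecorded field silently drops out of the count unless the programmer goes looking for it.

import csv
from collections import Counter

# Hypothetical incident file and column names, for illustration only.
incidents_by_borough = Counter()
rows_missing_borough = 0

with open("nyc_incidents.csv", newline="") as f:
    for row in csv.DictReader(f):
        borough = row.get("borough", "").strip()
        if borough:
            incidents_by_borough[borough] += 1
        else:
            rows_missing_borough += 1  # the records a naive tally would leave out

print(incidents_by_borough.most_common())
print("Records with no borough recorded:", rows_missing_borough)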

Q. What’s an example of a type of problem you might have your students solve that helps them understand their work as programmers more politically?


A. The data set about gun violence in New York City is already used by computer-science faculty in the classroom. But the way the problems are framed is: Walk through the data set, parse up where gun violence is and where it’s not. And then based on those findings, tell me where you would rather live and rather not live in New York City.

We use the data set, but with readings about gun violence. We ask: what’s the problem with asking the question in this way? How can we use this data to understand the phenomenon of gun violence rather than “these parts of New York City are good and these parts are bad”?
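
To make that framing point concrete, here is a small, self-contained sketch with made-up figures (not the real data set) showing how the same counts answer two different questions: ranking areas by raw incident totals points one way, while normalizing by population tells a different story about the phenomenon itself.

# Made-up figures, used only to show that the framing of the question
# shapes the answer.
incident_counts = {"Area A": 120, "Area B": 45, "Area C": 300}
population = {"Area A": 40_000, "Area B": 5_000, "Area C": 600_000}

by_raw_count = sorted(incident_counts, key=incident_counts.get, reverse=True)
by_rate = sorted(incident_counts,
                 key=lambda a: incident_counts[a] / population[a],
                 reverse=True)

print("Ranked by raw count:", by_raw_count)    # Area C first
print("Ranked by per-capita rate:", by_rate)   # Area B first -- a different picture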

More from the Chronicle of Higher Education.


Obviously for purposes of discussion.
The ideal subject of totalitarian rule is not the convinced Nazi or the dedicated Communist, but instead the people for whom the distinction between fact and fiction, true and false, no longer exists. -- Hannah Arendt

