
Artificial Intelligence

Ethical Considerations

Because generative AI is rapidly evolving, so are the social, legal, and ethical questions surrounding it. This page covers some of the concerns that have emerged.

Bias

ChatGPT, for example, was trained largely on text from the internet, and we all know how awful a place the internet can be. Human trainers are still needed to filter out the worst of what these models can produce. Because large language models learn from vast amounts of human-written text, it stands to reason that they parrot back human biases.

Hallucinations

ChatGPT and other AI tools are prone to a phenomenon called AI hallucination, in which they produce irrelevant or outright wrong answers. OpenAI directs users to click the “thumbs down” button when this happens, as a sort of reporting mechanism. ChatGPT is also not connected to the live internet, so it cannot comment on anything after 2022, when its training data ends.

Privacy

Another element to keep in mind is that OpenAI did not release GPT-3.5 out of the goodness of its heart. The company uses the public to further train ChatGPT, and says so in its FAQ: “Your conversations may be reviewed by our AI Trainers to improve our systems.” You can now disable model training, but doing so makes your chats temporary. Many other generative AI tools have similar privacy terms, which can be updated at any time.

A few questions to consider:

  • Is the data privacy policy findable?
  • Are you asked for consent to provide data, or are you automatically opted in?
  • How long is your data held (data retention)?
  • What safeguards are in place for de-identification?
  • How are privacy risks addressed?
  • Is there a third-party company involved? If so, what access and control does this company have over the data collected?

Environmental Concerns

The environmental impact of training a model like ChatGPT is massive, requiring enormous amounts of computing power and of cooling for that hardware. A preprint released on April 6, 2023 details the water cost of AI, estimating that training GPT-3 – the model that preceded GPT-3.5 – “can consume 700,000 liters of clean freshwater.” Training also consumes a great deal of energy more generally. Given these astronomical costs, we need to think carefully before training new models.