June 7 Computational Thinking

Python Coding

Today in STEM Skill we learned Python. I took a Python class before and remember some of the basic projects we did, such as:

  • defining a turtle and making it graph shapes
  • using "for i in range():" loops
  • working with strings
  • using the print() function
I chose the beginner Python path but had already learned some of the concepts. I was reminded of how picky the syntax can be and how frustrating that is. I got to the third section of part 3/6. It helped me gain a deeper understanding of Python and coding in general; a small sketch of the kinds of things we practiced is below.
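
This is just a minimal sketch of the sorts of beginner exercises I remember, not the exact code from the course. The turtle part opens a graphics window, so it assumes you are running it on a computer with a display.

```python
import turtle

# Strings and the print() function
name = "Gov School"
print("Hello, " + name + "!")

# A "for i in range():" loop
for i in range(5):
    print("Loop number", i)

# Defining a turtle and making it graph stuff (here, a simple square)
t = turtle.Turtle()
for side in range(4):
    t.forward(100)   # move forward 100 pixels
    t.right(90)      # turn 90 degrees to make a corner

turtle.done()  # keep the window open until it is closed
```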

ChatGPT's Biases

ChatGPT can be very biased, especially when it draws information from the internet or from samples that don't reflect the whole population. Specifically, it can be biased against certain groups, especially women and Black people. For example, it couldn't recognize famous figures like Serena Williams or Michelle Obama. Overall, ChatGPT shows many biases based on the data it is fed. It is important to consider the type of data you give an AI to prevent bias.

