📬 The Free-Range Technologist Jan 2020 🤠


Late-Jan Edition


Discoveries and Lifehacks


Article: America's most widely consumed oil causes genetic changes in the brain. Click here to read.

"You are what you eat" is a phrase that is not only inaccurate but has probably killed a lot of people who were fooled into avoiding fats, consumed sugar and carbs instead, and died early deaths. "What you eat changes you" would be a better phrase to live by. The emerging field of epigenetics is finding all sorts of ways that our environment and foods change the expression of our genes. This study examines how consuming soybean oil changes gene expression and might be linked to autism, Alzheimer's disease, anxiety, and depression. More reason to eat like our ancestors (who probably never encountered a soybean in their lives). Click here to read.


Prediction: The Next Big Social Network is Email.

From The Interface by Casey Newton: "The next big social network is email. Newsletters are the new websites, and expect to see communities growing up around them in interesting new ways, led by companies like Substack (a platform for paid subscription newsletters)."

I agree. Crafting and sending The Free-Range Technologist is slow and somewhat inefficient, but writing a longer message allows me to work on my "slow thinking" skills that are important to making and maintaining relationships.

On a related note: this semester, I am trying a new tactic to get my students to read more current events (an obsession I inherited from Mike C.). I suggested they subscribe to and read two daily email newsletters (The Morning Brew and The Missouri Business Alert), and I am now crafting all extra-credit questions from the most relevant news articles in those publications (including some randomly released extra-credit quizzes throughout the semester). So far, it seems to be having some effect!


Book Reviews


Book: Glow Kids by Nicholas Kardaras.

Glow Kids is a well-written book about the effects of screens on kids and teenagers, from video games and so-called educational games to social media and online porn. Well documented with lots of cases and references to current research, this book examines the causes of screen addiction from the perspective of a professional psychologist who specializes in treating digital addiction.

Just like cocaine and other drugs, some proportion of the population is highly susceptible to online addiction. Many of our online experiences are designed to deliver a dopamine release to encourage addictive behavior. Think of the infinite scrolling common to social media. The effect of this constant stimulation is not good, and the research points to a hazardous impact on the brains of children and teens whose brains are still developing.

"Our brains are simply not designed for the visual hyperstimulation with which recently developed digital technology bombards us."
-Page 18


One of the things that I re-discovered in this book (I forget things!) was how many technologists limit the time that their own kids spend on screens.

"…ironically, the most tech-cautious parents are the people who invented our iCulture. People are shocked to find out that tech god Steve Jobs was a low-tech parent; in 2010, when a reporter suggested that his children must love their just-released iPad, he replied: 'They haven't used it. We limit how much technology our kids use at home.'"
-Page 31


I highly recommend this book, especially for those who have young children or are planning on having some! The author's writing style might not appeal to everyone; he is sometimes too wordy, for example, pausing to tell you whenever something is or is not intended as a pun. But the book also has lots of great examples and stories that help carry you along.

Amazon Link: Click Here


Book: Behind the Screen: Content Moderation in the Shadows of Social Media by Sarah T. Roberts.

Behind the Screen is an excellent book about the content moderation industry. While the book opens with more discussion of academic labor research than a popular-press book needs (in my opinion), it quickly gets into the nitty-gritty of the content moderation industry, its effect on our lives, and the people who do the work.

"Rather than elevating the workers of the world, twenty-first-century configurations of labor are undergoing a globalized race to the bottom in search of ever-cheaper, faster, and more human and material resources to compete in the globalized 24/7 networked marketplace."
-Page 72


One might assume that YouTube and Facebook employees do content moderation. However, most content moderation is not done by direct employees of the technology companies. Watching and removing violent and hateful content is done by contractors, or sub-contractors of contractors, many of whom are overseas. This system of contractors allows tech companies to distance themselves from any liability for the effect that sorting and tagging violent and sadistic videos and posts, hour after hour for years on end, could have on a person's mind.

Ironically, it is the very existence of sites that allow direct upload and streaming of content that incentivizes the creation of such horrible videos.

Beyond removing content, many content moderation companies also create content for commercial interests, steering online discussions to match the corporate message (done by humans) and using bots to amplify and reinforce that message.

I used to think of the online world as a digitized version of the physical world. This book makes it clear that the online world is a highly curated commercial world in which paid humans and increasingly sophisticated bots shape and distort digital reality.

I highly recommend this book to all.

Amazon Link: Click Here


Book: Chaos Monkeys: Obscene Fortune and Random Failure in Silicon Valley by Antonio Garcia Martinez.

This book is a fun story and, at times, insightful commentary about three founders, a startup, and the rise of social media. The author chronicles his adventures in Silicon Valley with great wit and enthusiasm, explaining how companies like Twitter and Facebook came into being and fought each other for dominance.

Probably the most interesting stories center on the struggle inside Facebook to develop a profitable revenue and ad model in the run-up to the company's IPO. Facebook's leaders were filled with admirable conviction about the need to protect user privacy but threw it all out the window when they realized that billions of dollars, and the future of the company, were at stake.

I'd recommend taking a look at this book if you are interested in the world of social media and big tech (with an emphasis on the culture). If that is your bag, however, I would strongly recommend you first check out Anna Wiener's Uncanny Valley (Amazon Link), which I am reading right now.

Amazon Link: Click Here


Book: Permanent Record by Edward Snowden.

I can't say much about this book that has not already been said about Snowden's memoir. A few points:

  • He details his personal history.

  • He clearly explains the technical concepts of his work, how he spirited secrets off the network, and security technology in general.

  • The book is well-written and carried forward with good storytelling along an approximately 15-year timeline from 9/11 to his post-disclosure isolation in Moscow.

Amazon Link: Click Here


Sharing


Trying to keep Exploring --> Learning --> Building --> Sharing. I am encouraged that people are interested in learning more about how AI is weaponized on social media platforms. I have several podcast interviews and talks lined up for the start of 2020 on this topic.

This month, I shared my perspective on the dangerous uses of AI with Ozan Varol, who featured the highlights in his "Contrarian Spotlight." Ozan is a rocket scientist turned award-winning professor and author. Interestingly, Congress just started talking about the need to regulate certain forms of AI (AP article). From the interview: "Unfortunately, Facebook, Twitter, and Google are just examples of the problem: humans using AI against other humans. We need an FDA for AI. We should be able to know when we are just browsing a page or when an AI is tracking our behaviors to drive us to some goal. And we should be able to find out what that goal is: making a purchase, supporting a candidate, etc. At the global scale, I think that now is the time to develop an international framework for regulating all types of AI." Full article here: https://ozanvarol.com/j-scott-christianson/

BTW, I created a page with resources for those interested in AI and social media: http://learnabout.ai. Let me know if there are resources I should add!


What I Am Looking Forward To


Getting responses from you all, hearing about what you have been creating recently, and the resources/lifehacks you are using! Just hit the reply button and fill me in!

Events:

  • Discussing Adversarial Machine Learning/AI on Radio Friends with Paul Pepper on Jan 31st, 2020. TODAY!!

  • Discussing Student-Centric AI on the Enrollment Growth University podcast in late Feb.

  • Talking AI and Social Media on the Keep it Juicy podcast in Feb.

  • Talking about AI and Medicine on The InnovaBuzz Podcast in March.

  • Talking about "Living a Good Life: AI and Social Media," with the Fundamentals of Globalization and Digital Technologies class in April.

  • Potentially appearing on the Mark Struczewski podcast this summer, discussing AI and Social Media.

  • For a full list of upcoming events, click here. Let me know if you want to attend an event; I'll get you in! Or if you'd like me to talk at your event: I work for coffee and scones. 😎

Thanks for reading and thanks for your insights. Was this information interesting or useful? If so, click here to forward this issue to a colleague or friend.

Take care,


jscottchristianson@mac.com

Previous Issues