Technology & Society - Bare Essentials

Introduction

Technology is any tool, system, or method that extends human capability. A wheelchair extends mobility. A telescope extends vision. A smartphone extends communication, memory, and access to information. Writing extends our ability to preserve and share knowledge across time and space. Technology has always been central to human potential—from fire and agriculture to antibiotics and the internet.

Technology amplifies what humans can do, both individually and collectively. It helps us overcome barriers: physical disabilities, distance, ignorance, resource scarcity. But technology also amplifies our vulnerabilities. The same smartphone that connects you to loved ones can track your location, manipulate your attention, and expose your private information. The same social media that enables global movements can spread misinformation and enable surveillance. Nuclear power can provide clean energy or devastating weapons.

Understanding technology—how it works, what trade-offs it involves, who benefits and who pays the costs—is essential for autonomy and informed decision-making in the modern world. Technology literacy isn’t about becoming an engineer or activist; it’s about making informed choices about the tools you use, the systems you participate in, and the future you help create.

In this topic, you’ll learn:

- How to evaluate any technology’s benefits, risks, and trade-offs
- The difference between security and privacy, and why both matter
- How digital systems collect and use your information
- Practical steps to protect your autonomy and make informed choices
- How technology intersects with other skills in this program

This connects to other topics: Technology applies principles from Science (Level 2), requires Critical Thinking (Level 2) to evaluate claims and recognize manipulation, affects how we practice Communication Skills (Level 2) and Community & Cooperation (Level 2), and demands Long-term Thinking (Level 2) to anticipate consequences. At Level 3, you’ll explore how technology shapes entire systems and institutions.


How It Helps

Understanding technology and society helps you across every area of life:

Personal autonomy and safety: You make informed decisions about what tools to use, what information to share, and what risks to accept. You protect yourself from identity theft, financial fraud, and manipulation. You understand the difference between security (protection from unauthorized access or harm) and privacy (control over what you share, even with authorized parties), and you can evaluate trade-offs between them. For example, a cloud storage service might offer strong security against hackers but weak privacy because the company can access your files or comply with government requests.

Health and accessibility: You recognize how technology overcomes barriers for people with disabilities—screen readers for the blind, hearing aids, prosthetics, communication devices for non-verbal individuals. You understand how medical technology extends life and capability, and how to evaluate health-related technologies critically. You make informed choices about health apps, wearable devices, and medical treatments.

Work and education: You adapt to new tools and systems efficiently. You evaluate software, platforms, and devices for your needs rather than following trends. You protect your professional reputation and intellectual property. You understand how automation and AI affect employment and can plan accordingly. You use technology to learn, create, and collaborate more effectively.

Civic participation: You make informed decisions about technology policy issues that affect everyone—data privacy laws, internet regulation, AI governance, environmental impacts of technology. You can evaluate complex technologies (like nuclear power, genetic engineering, or facial recognition) by examining evidence, understanding trade-offs, and considering who benefits and who bears the costs. You’re not manipulated by oversimplified claims from any political perspective.

Relationships and community: You understand how digital communication differs from face-to-face interaction and choose appropriately. You recognize how social media and messaging apps can both strengthen and damage relationships. You protect your family’s and friends’ privacy, not just your own. You build and maintain communities using technology thoughtfully, as explored in Community & Cooperation (Level 2).

Financial well-being: You avoid scams, predatory apps, and exploitative platforms. You understand how “free” services actually work (usually through data collection and advertising). You make informed choices about digital payments, online banking, and financial technology. You evaluate the true costs of subscription services and planned obsolescence.

Mental health and attention: You recognize how technology affects your cognition, emotions, and behavior—the horse in the horse-carriage-driver metaphor from Level 1. You understand attention manipulation, infinite scroll, notification design, and other tactics that hijack your focus. You make deliberate choices about screen time, app usage, and digital habits rather than being controlled by them.


Practical Guide

Understanding What Technology Is

Technology isn’t just computers and smartphones. Any tool that extends human capability is technology. A pencil extends your ability to preserve thoughts. Eyeglasses extend vision. A bicycle extends mobility. A recipe extends culinary knowledge across generations. The carriage (body), driver (mind), and horse (emotions) from Level 1 can all be extended by technology: a wheelchair or bicycle extends the carriage, writing extends the driver, and music extends the horse.

Technology also creates new vulnerabilities for each:

- Carriage vulnerabilities: Dependence on devices (pacemakers, insulin pumps), physical tracking, biometric data collection
- Driver vulnerabilities: Information overload, filter bubbles, misinformation, cognitive manipulation
- Horse vulnerabilities: Attention hijacking, emotional manipulation through design, social comparison, addiction patterns

Evaluating Any Technology

When considering whether to use a technology—whether it’s a new app, a medical device, a social media platform, or a major infrastructure project—ask these questions:

1. What capability does it extend or barrier does it overcome?
   - What problem does it solve?
   - For whom does it solve this problem?
   - Are there people it helps more than others?

2. What are the benefits?
   - Convenience, efficiency, capability, access, safety, connection, knowledge?
   - Are these benefits immediate or long-term?
   - Are they individual or collective?

3. What are the costs and risks?
   - Financial costs (purchase, subscription, maintenance)?
   - Time costs (learning curve, ongoing maintenance, troubleshooting)?
   - Privacy costs (what data is collected, shared, or sold)?
   - Security risks (vulnerability to hacking, theft, or misuse)?
   - Health risks (physical, mental, or emotional)?
   - Social costs (isolation, inequality, manipulation)?
   - Environmental costs (manufacturing, energy use, disposal)?

4. What are the trade-offs?
   - What do you gain versus what do you give up?
   - Convenience vs. privacy? Security vs. ease of use? Innovation vs. safety?
   - Are you trading short-term benefits for long-term costs?

5. Who benefits and who pays?
   - Who profits from this technology?
   - Whose interests does it serve?
   - Who bears the risks and costs?
   - Are benefits and costs distributed fairly?

6. What are the alternatives?
   - Can you solve the same problem differently?
   - What are the trade-offs of those alternatives?
   - What happens if you don’t use this technology at all?
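For readers who like structured notes, the six questions above map naturally onto a simple record you fill in once per technology. A minimal Python sketch — the class and field names are illustrative, not part of the framework itself:

```python
from dataclasses import dataclass, field

@dataclass
class TechEvaluation:
    """Worksheet mirroring the six evaluation questions (all names illustrative)."""
    name: str
    capability_extended: str = ""                             # 1. capability or barrier
    benefits: list[str] = field(default_factory=list)         # 2. benefits
    costs_and_risks: list[str] = field(default_factory=list)  # 3. costs and risks
    trade_offs: str = ""                                      # 4. gains vs. give-ups
    who_benefits_who_pays: str = ""                           # 5. distribution of benefit/cost
    alternatives: list[str] = field(default_factory=list)     # 6. other ways to solve it

    def summary(self) -> str:
        return (f"{self.name}: extends {self.capability_extended}; "
                f"{len(self.benefits)} benefits vs. {len(self.costs_and_risks)} costs/risks; "
                f"{len(self.alternatives)} alternatives considered")

notes = TechEvaluation(
    name="Free photo-sharing app",
    capability_extended="staying connected",
    benefits=["easy communication", "global reach"],
    costs_and_risks=["data collection", "attention cost", "comparison effects"],
    alternatives=["direct messaging", "in-person gatherings"],
)
print(notes.summary())
```

A filled-in worksheet like this makes the final decision step easier: you can see at a glance whether the costs column has quietly outgrown the benefits column.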

Example: Evaluating a “Free” Social Media Platform

Let’s apply this framework to something like Facebook, Instagram, or TikTok:

  1. Capability extended: Staying connected with friends/family, sharing experiences, discovering content, organizing events
  2. Benefits: Easy communication, entertainment, community building, business promotion, global reach
  3. Costs/risks: Time consumption, detailed data collection (location, behavior, preferences, relationships), targeted manipulation, mental health impacts (comparison, FOMO), exposure to misinformation
  4. Trade-offs: Convenience and connection vs. privacy and attention; “free” access vs. being the product
  5. Who benefits/pays: Company profits from your data and attention; advertisers gain targeting capability; you pay with data, attention, and privacy; society pays with polarization and misinformation
  6. Alternatives: Direct messaging apps with better privacy (Signal, Element), federated social networks (Mastodon), phone calls, in-person gatherings—each with their own trade-offs

This doesn’t mean “don’t use social media”—it means understand what you’re trading and make an informed choice.

Understanding Security vs. Privacy

Many people use these terms interchangeably, but they’re different:

Security means protection from unauthorized access, theft, damage, or harm. Like locking your door or car—you’re protecting against people who shouldn’t have access.

Privacy means control over what information you share and with whom, including authorized parties. Like closing your curtains—even people who could legally look (neighbors, passersby) don’t get to see inside.

Digital security examples:

- Strong passwords protect against unauthorized account access
- Encryption protects data from being read if intercepted
- Two-factor authentication adds an extra layer of protection
- Antivirus software protects against malware

Digital privacy examples:

- Choosing what personal information to put on social media
- Deciding whether to share your location with an app
- Reading privacy policies to know what companies do with your data
- Using privacy-respecting alternatives to data-hungry services

Trade-offs between security and privacy:

- A cloud backup service might be very secure (encrypted, protected from hackers) but bad for privacy (the company can access your files)
- Storing data only on your personal device might be more private but less secure if you lose the device or it gets damaged
- Biometric authentication (fingerprint, face recognition) can be very secure but requires sharing permanent biological data that affects privacy

You often have to balance both, just as you already do in physical life: you lock your door (security) but also close curtains when changing clothes (privacy).
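One way to see "strong passwords protect against unauthorized access" in practice: a well-run service never stores your actual password (which would make any breach catastrophic); it stores only a salted, deliberately slow hash and re-derives it to check each login. A minimal sketch using Python's standard library — the parameter values are illustrative, and real services use dedicated, carefully tuned password-hashing schemes:

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # illustrative; services tune this to their hardware

def hash_password(password: str, salt: bytes = None) -> tuple:
    """Return (salt, digest); the service stores these, never the password itself."""
    salt = salt if salt is not None else os.urandom(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(attempt: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the hash from the login attempt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", attempt.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)
```

Even if the stored (salt, digest) pairs leak, an attacker must guess passwords one slow hash at a time — which is also why long, unique passwords matter so much.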

How Digital Systems Collect and Use Information

Most digital services collect far more information than you realize. Understanding how this works helps you make informed choices.

What gets collected:

- Explicit data: What you intentionally provide (name, email, photos, posts, messages)
- Behavioral data: What you click, how long you look at things, what you search for, when you’re active
- Metadata: Who you communicate with, when, how often, from where
- Location data: Where you are, where you go, how long you stay
- Device data: What device you use, what apps you have, your contacts list
- Inferred data: Predictions about your interests, political views, health conditions, financial status, relationships

How it’s used:

- Targeted advertising: Showing you ads based on your behavior and inferred characteristics
- Behavioral prediction: Predicting what you’ll do, buy, click, or believe
- Content manipulation: Showing you content designed to keep you engaged (often outrage or anxiety-inducing)
- Selling to third parties: Data brokers, marketers, insurance companies, sometimes governments
- Training AI systems: Your data trains algorithms that affect millions of people

Business models to understand:

Many services are “free” because you are the product being sold—or more precisely, your attention and data are the products. Companies like Google, Facebook, and TikTok make money by:

  1. Collecting detailed information about you
  2. Using that information to predict and influence your behavior
  3. Selling access to your attention to advertisers
  4. Sharing or selling your data to third parties—data brokers who compile profiles across multiple services, insurance companies assessing risk, political campaigns targeting voters, employers screening candidates, and many others you never agreed to interact with

Understanding this business model helps you evaluate whether the trade-offs are acceptable to you.

Alternatives exist with different trade-offs:

- Paid services: You pay money instead of data (like Proton Mail vs. Gmail)
- Open-source/community-run: Run by volunteers or nonprofits (like Mastodon vs. Twitter)
- Minimal-data services: Collect only what’s necessary (like Signal vs. WhatsApp)

Each alternative has trade-offs: paid services cost money; community-run services may have fewer features or smaller user bases; minimal-data services may be less convenient.

General Risks of Weak Privacy and Security

Understanding risks helps you decide what protections are worth the effort:

Identity theft and financial fraud: If someone gains access to your personal information, they can impersonate you, open accounts in your name, or steal your money.

Manipulation and loss of autonomy: When companies know your psychological vulnerabilities, they can manipulate your decisions—what you buy, how you vote, what you believe, how you spend your time.

Surveillance and control: Governments, employers, insurance companies, and others can use your data to monitor, judge, restrict, or control you—denying opportunities, raising prices, or limiting freedom.

Discrimination: Data can be used to discriminate in housing, employment, insurance, credit, and other areas—often in ways that are invisible to you.

Permanent records: Information shared online can persist forever, affecting future opportunities even decades later. Youthful mistakes or changing views can haunt you permanently.

Social engineering and scams: The more information available about you, the easier it is for scammers to craft convincing frauds targeting you or your loved ones.

Chilling effects: Knowing you’re being watched changes behavior—people self-censor, avoid certain topics, or don’t seek help for stigmatized issues.

Data breaches and leaks: Even if you trust a company’s intentions, you’re also trusting their competence and security. Companies get hacked (Equifax, Target, countless others), employees make mistakes that expose data, and security vulnerabilities can exist for years before discovery. Once your data is leaked, you can’t take it back. Additionally, companies get bought, sold, or change their policies—the company you trusted with your data today might not be the one holding it tomorrow.

“I Have Nothing to Hide”

A common response to privacy concerns is “I have nothing to hide, so why should I care?” This misunderstands what privacy protects:

Contexts change: Even if you have nothing to hide today, data persists. Laws change, governments change, employers change, social norms change. Something innocent now might become problematic later. Political views, health conditions, religious beliefs, or lifestyle choices that are accepted today might be stigmatized or criminalized tomorrow.

You can’t control who gets your data next: You might trust the company collecting your data, but can you trust everyone they share it with? Everyone who might hack them? The company that buys them in five years? The government that demands access? Once data exists, you lose control over it.

Privacy is about autonomy, not secrecy: Privacy isn’t about hiding bad things—it’s about controlling your own narrative and making your own choices. It’s the difference between choosing what to share with whom, versus having everything about you available to anyone who wants to look.

Everyone has something: Medical information, financial struggles, relationship issues, embarrassing moments, private conversations, things you’re working through, mistakes you’ve made. These aren’t necessarily “bad,” but they’re yours to share or not share as you choose.

Power imbalance matters: “Nothing to hide” assumes those with power will use information fairly and benevolently. History shows this isn’t always true. Privacy protects against abuse of power.

Practical Steps for Protection

You don’t need to become a cybersecurity expert. These basic practices significantly improve your security and privacy:

For security:

  1. Use strong, unique passwords: Different password for every important account; use a password manager to remember them (like Bitwarden, KeePassXC, or even browser-built-in managers)

  2. Enable two-factor authentication (2FA): Adds a second step beyond password (usually a code from your phone); protects even if password is stolen. Note: Some 2FA methods (like SMS codes) require sharing your phone number, which can affect privacy; authenticator apps (like Google Authenticator, Authy) are more private alternatives.

  3. Keep software updated: Updates often fix security vulnerabilities; enable automatic updates when possible. Note: Most people should enable automatic updates for security; advanced users may want to review updates first to avoid unwanted feature changes, but this requires more technical knowledge.

  4. Be skeptical of unexpected messages: Don’t click links or download attachments from unexpected emails, texts, or messages—even if they look legitimate. When in doubt, contact the supposed sender directly through a known method.

  5. Back up important data: Keep copies in multiple places so you don’t lose everything if a device fails or is stolen
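The authenticator apps mentioned in step 2 are not magic: at setup, the app and the website share a secret key, and afterwards each side independently computes a short-lived code from that key and the current time using the standard HOTP/TOTP algorithms (RFC 4226 / RFC 6238). A simplified Python sketch — real apps add clock-skew handling and secure key storage:

```python
import base64
import hashlib
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """One-time code from a shared secret and a counter (RFC 4226)."""
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                                   # dynamic truncation
    value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(value % 10 ** digits).zfill(digits)

def totp(secret_b32: str, interval: int = 30) -> str:
    """Time-based variant: the counter is just the current 30-second window (RFC 6238)."""
    key = base64.b32decode(secret_b32, casefold=True)
    return hotp(key, int(time.time()) // interval)
```

Because both sides derive the code locally, nothing secret travels over the network at login time — an attacker who has stolen your password still lacks the current code.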

For privacy:

  1. Read permissions before installing apps: Does a flashlight app really need access to your contacts and location? Deny unnecessary permissions.

  2. Review privacy settings: Most platforms have privacy settings you can adjust—who can see your posts, whether your location is shared, whether your data is used for advertising. Take 10 minutes to review them.

  3. Understand what “free” means: If you’re not paying for the product, ask what you’re trading instead. Sometimes that trade-off is worth it; sometimes it’s not.

  4. Consider alternatives: When privacy matters to you, look for alternatives with better privacy practices. The trade-off might be less convenience or fewer features—decide what matters more for each use case.

  5. Think before sharing: Once information is online, you often can’t take it back. Ask: “Would I be comfortable with this being public forever?”

  6. Use private/incognito browsing when appropriate: Doesn’t make you invisible, but prevents your browser from saving history and cookies for that session

You already do this in physical life: You lock your doors (security), close curtains when you want privacy, don’t share your banking PIN, check reviews before buying things, and read warning labels on products. Digital life deserves the same reasonable caution—not paranoia, just informed awareness.

Long-term Thinking and Technology

As covered in Long-term Thinking (Level 2), technology decisions have long-term consequences:

Technology lock-in: Once you invest in a technology ecosystem (all Apple devices, all Google services, all Microsoft software), switching becomes costly and difficult. Consider long-term implications before committing.

Path dependence: Early technology choices shape future options. The QWERTY keyboard layout, designed for mechanical typewriters, persists despite better alternatives because everyone already learned it.

Unintended consequences: Technologies designed to solve one problem often create new ones. Social media connected people globally but also enabled unprecedented manipulation and polarization. Antibiotics save millions but also create resistant bacteria.

Generational impacts: Technology you use today affects future generations—data collected about children, environmental impacts of manufacturing and disposal, social norms established around technology use.

Think beyond immediate convenience: How might this technology affect you in 5, 10, or 20 years? What world are we creating for future generations?


Practice Exercises

Comprehension Check

These exercises help you verify you understood the key concepts:

  1. What is technology? In your own words, define technology and give three examples that aren’t computers or smartphones.

  2. Security vs. Privacy: Explain the difference between security and privacy. Give one example of a technology that offers good security but poor privacy, and explain why.

  3. Business models: Explain how “free” social media platforms make money. What are you trading when you use these services?

  4. Evaluation framework: List the six questions to ask when evaluating any technology.

  5. Data collection: Name three types of data that apps and websites collect beyond what you explicitly provide.

Reflection Exercises

These exercises help you think about how technology affects your life:

  1. Personal technology audit: List the five technologies you use most often (apps, devices, services). For each one, briefly note: What capability does it extend? What do you trade for using it (time, money, data, attention)?

  2. “Nothing to hide” response: Have you ever thought or said “I have nothing to hide” about privacy? After reading this topic, how would you respond to that statement now? What changed in your thinking, if anything?

  3. Trust and control: Think of a service you use that collects your data. Do you trust the company that runs it? Do you trust everyone they might share your data with? Everyone who might acquire the company in the future? How does thinking about this change your comfort level?

  4. Trade-offs you’ve made: Identify one technology where you consciously chose convenience over privacy, and one where you chose privacy over convenience. What factors influenced each decision?

  5. Long-term thinking: Choose one technology you use regularly. Imagine you’re still using it in 10 years. What might change? What risks might emerge that don’t exist today?

Application Exercises

These exercises help you practice the skills in real situations:

  1. Evaluate a technology: Choose a technology you’re considering using (a new app, device, service, or platform). Apply the six evaluation questions from this topic. Based on your analysis, what’s your decision and why?

  2. Privacy settings review: Choose one platform or service you use regularly. Spend 10 minutes exploring its privacy settings. What options are available? What surprised you? Make at least one change that better reflects your preferences.

  3. Read a privacy policy: Pick an app or service you use. Find and skim its privacy policy (don’t worry about reading every word). Identify: What data do they collect? Who do they share it with? What are your options? Summarize in 3-5 sentences what you learned.

  4. Security improvement: Implement one security practice you’re not currently doing: enable 2FA on an important account, start using a password manager, review app permissions on your phone, or set up automatic backups. Document what you did and any challenges you faced.

  5. Find an alternative: Identify one service you use that has privacy concerns. Research at least two alternatives with better privacy practices. Compare the trade-offs (features, convenience, cost, network effects). You don’t have to switch—just understand your options.

  6. Technology-free problem solving: Think of a problem you currently solve with technology. How could you solve it without that technology? What would you gain and lose? This isn’t about rejecting technology—it’s about understanding what it does for you and what alternatives exist.

Discussion Exercises

These work best with a partner or group, but can also be done as journaling prompts:

  1. Technology trade-offs debate: Choose a controversial technology (social media, facial recognition, AI assistants, genetic testing, etc.). One person argues the benefits, another argues the risks. Then switch sides. What did you learn from arguing both perspectives?

  2. Generational perspectives: If possible, discuss technology use with someone from a different generation (older or younger). How do their attitudes toward privacy, security, and technology differ from yours? What can you learn from each other?

  3. “Who benefits, who pays?” Choose a major technology that affects your community (ride-sharing apps, delivery services, automated checkout, surveillance cameras, etc.). Discuss: Who benefits most? Who bears the costs? Are these distributed fairly?

  4. Future scenarios: Imagine a technology that doesn’t exist yet but might in 10-20 years (brain-computer interfaces, ubiquitous AI, advanced genetic modification, etc.). As a group, apply the evaluation framework: What capabilities might it extend? What risks might it create? What trade-offs would people face?

  5. Privacy boundaries: Discuss with others: Where do you draw the line on privacy? What information are you comfortable sharing, and with whom? How do your boundaries differ from others’? What influences where people draw these lines?

  6. Technology and community: Discuss how technology has affected your community or a community you’re part of. Has it strengthened or weakened connections? Created new opportunities or new problems? What would you change if you could?


Key Sources & Further Reading

Foundational Concepts

Shoshana Zuboff, The Age of Surveillance Capitalism (2019)
Comprehensive examination of how tech companies commodify personal data and behavior. Dense but essential for understanding modern business models.

Cory Doctorow, various essays and talks
Technology writer and activist who explains complex tech issues accessibly. His concept of “enshittification” describes how platforms decay in quality over time. Available free online at pluralistic.net

Bruce Schneier, Data and Goliath (2015)
Security expert explains surveillance, data collection, and practical steps for protection. Balances technical detail with accessibility.

Langdon Winner, “Do Artifacts Have Politics?” (1980)
Classic essay on how technology embeds values and power structures. Shows technology is never neutral.

Privacy and Security Practical Guides

Electronic Frontier Foundation (EFF) - Surveillance Self-Defense
Free, regularly updated guides for protecting privacy and security. Covers basics to advanced techniques. Available at ssd.eff.org

Privacy Guides (privacyguides.org)
Community-maintained recommendations for privacy-respecting tools and services, with explanations of trade-offs.

Consumer Reports Security Planner
Step-by-step personalized security recommendations based on your situation and concerns. Free tool at securityplanner.consumerreports.org

Technology and Society

Zeynep Tufekci, Twitter and Tear Gas (2017)
How social media affects social movements and collective action. Shows both empowering and limiting effects of digital technology.

Safiya Noble, Algorithms of Oppression (2018)
How search engines and algorithms can reinforce discrimination and inequality. Important for understanding hidden biases in technology.

Neil Postman, Technopoly (1992)
Older but still relevant examination of how technology shapes culture and thought. Accessible and thought-provoking.

Historical Perspectives

Lewis Mumford, Technics and Civilization (1934)
Classic work on how technology has shaped human society throughout history. Shows technology’s role isn’t new, just accelerated.

Ursula Franklin, The Real World of Technology (1989)
Physicist and peace activist examines how technologies prescribe and control. Based on accessible lecture series.

Digital Literacy and Critical Thinking

danah boyd, It’s Complicated: The Social Lives of Networked Teens (2014)
Research on how young people actually use technology, challenging common assumptions. Useful for understanding digital life across generations.

Eli Pariser, The Filter Bubble (2011)
How personalization algorithms create echo chambers and limit exposure to diverse perspectives. Connects to Critical Thinking (Level 2).

Accessibility and Assistive Technology

World Health Organization, World Report on Disability (2011)
Comprehensive overview of disability globally, including the role of assistive technology in overcoming barriers. Free PDF available from WHO.

Assistive Technology Industry Association (ATIA) - What is AT?
Non-profit membership organization focused on assistive technology. Provides comprehensive, accessible information about assistive technology across all types of disabilities. Available at atia.org/home/at-resources/what-is-at/

Policy and Governance

Shoshana Zuboff, “Big Other: Surveillance Capitalism and the Prospects of an Information Civilization” (2015)
Academic paper version of her book’s core argument. Free online, more concise than the full book.

European Union General Data Protection Regulation (GDPR)
Landmark privacy law that influenced global standards. Reading the actual regulation teaches what rights you might have. Available at gdpr.eu

Ongoing Learning

Podcasts:

- Your Undivided Attention (Center for Humane Technology) - Technology’s impact on society and attention
- IRL: Online Life is Real Life (Mozilla) - Technology and everyday life
- Note to Self (archived, but episodes remain valuable) - Technology and humanity

Websites and Organizations:

- Electronic Frontier Foundation (eff.org) - Digital rights, privacy, free speech
- Mozilla Foundation (foundation.mozilla.org) - Internet health, digital literacy
- Access Now (accessnow.org) - Digital rights globally, not just Western focus
- Center for Humane Technology (humanetech.com) - Technology design and ethics

News and Analysis:

- Ars Technica - Technical news with depth and context
- Krebs on Security - Security news and practical advice
- Techdirt - Technology policy and digital rights

For Deeper Study

These topics connect to the Intermediate and Advanced levels of this topic.

Cross-References to Other Topics

This topic connects throughout the Techne System.


Return to the Topic Navigation Page.