Research
"I have never let my schooling interfere with my education."
                                                      - Mark Twain

Current Research Projects:


Image credit: Wikimedia Commons
Seeing Like a Valley: The Moral Visions of Silicon Valley (2017-present). Industrial practices in Silicon Valley provide more than technological products to the world. They change the way we work and play and how we value ourselves and others. They shape not just how we live, but how we think we ought to live. In short, they provide moral vision. How can we describe the moral visions of this powerful region? How do they relate to past and present alternatives, and how do they interact with contradictory and complementary visions around the world? Do they enable industry leaders to see and respond to those living in Silicon Valley and beyond? Do they recognize the daily experience of growing inequalities and entrenched forms of discrimination? Whose experiences are seen, and whose are rendered unrecognizable, by life in silico? Digitized, on-demanded, automated, machine-learned, self-surveilled, self-driven – these are just a few of the practices that embed values and distribute resources toward some and not others. New systems of value emerge with rapid cultural shifts in how we eat and move, live and love, see and be seen.

Seeing Like a Valley seeks to bring together scholars, policy makers, technologists, journalists, media creators, artists, and activists from across U.C. Berkeley, nearby institutions, Silicon Valley, and around the world. We seek to understand the role the Valley plays in shaping not just new technologies but moral visions, and how those moral visions are ‘exported’ alongside technologies around the world. We will explore how these visions help and hinder our ability to see and respond to today’s pressing issues, including growing inequalities and entrenched forms of discrimination, political polarization, declining trust in institutions, and changing labor practices. Together, we will investigate the world these moral visions are trying to produce, and the divergence between that world and the local realities of the San Francisco Bay Area.

Framing our analysis is anthropologist James C. Scott’s book Seeing Like a State, which describes how modern states are good at knowing in a specific yet quite narrow manner: they regularize, rationalize, and codify the world. As attempts at legibility, these practices often fail a state’s subjects: map-making and census-taking, for instance, do not encompass the messy realities of life on the ground. Scott argues that the particular ways states ‘see’ then lead to weak civil societies and authoritarian governance. These concepts about statehood are also useful for studying other regimes of power, in particular the industrial region known as Silicon Valley.

We recognize that the very use of the term ‘Silicon Valley’ is political. It was first applied to post-WWII semiconductor companies in Santa Clara, Sunnyvale, and San Jose, whose sprawling suburban campuses, clean rooms, and Superfund refuse sites displaced the fruit orchards that had blanketed the region in the century before. Since then, ‘Silicon Valley’ has come to occupy a powerful place across the San Francisco Bay Area and around the world, both materially and symbolically. Animated by massive tides of capital and global flows of employees and knowledges, its success is the baseline by which other technology hubs around the world are judged, as indicated by the proliferation of ‘silicon’ nicknames: Silicon Alley, Silicon Beach, Silicon Border, and dozens more.

Like modern states, Silicon Valley attempts to regularize, rationalize, and codify the world to make it legible to machines and technological practices. Often, this ‘valley vision’ willfully ignores local practices and historical precedent in favor of universalism, institutional ‘disruption’, and beliefs in exceptionalism. Our aim is to examine the ethical and value-laden judgments these moral visions imply, whom they leave out, and what they mean for our own commitments to social justice.



Image credit: Mia Steinkirch
Algorithms in Culture (2016-present). A foundational concept in computer science, algorithms – loosely defined as sets of rules that direct the behavior of machines or humans – have shaped infrastructures, practices, and daily lives around the world. We explore the implications of their development and deployment in politics, media, science, organizations, culture, and the construction of the self. Understanding these implications has become ever more pressing in both academia and public discourse. This research area contributes to the burgeoning field of "algorithm studies," which takes algorithms as an object of cultural inquiry from social scientific and humanistic perspectives. Though not an exhaustive list, the following questions guide our inquiry:

The multiple definitions and histories of algorithms: The term 'algorithm' predates the digital computer by over a thousand years, with an etymology traceable to the ninth-century Islamic scholar al-Khwarizmi. How broadly might we usefully define the term today? Are contemporary algorithms a necessarily computational phenomenon? What does the last decade’s explosion of discourse about ‘algorithms’ in popular culture mean?

Algorithms as more than computation: What does it mean to study algorithms as myth, narrative, ideology, discourse, or power? In what ways can these approaches contribute back to concepts and questions within computer science, data science, and big data initiatives?

Algorithms as specifically computational: What kinds of applications and activities are now possible given certain developments in computational infrastructure and theories of computation, such as big data, deep neural networks, distributed computing, or ‘microwork’? What are the social and theoretical implications of these developments?

The practices and materialities of algorithms: Just as many in the interdisciplinary field of science studies advocate for a focus on the local practices and material artifacts that produce and sustain scientific knowledge, what kinds of work are done to make algorithms computable, and what are the material effects of algorithms?

Living with algorithms, quantifying the self: Algorithms pervade daily life, and we experience their reach and impact almost anywhere, not just while working at a computer. How can we better understand how far-flung domains are being reshaped by algorithms? What are the consequences of big data and the quantified self in the everyday and in civic life?

Selected output: "Algorithms in Culture," a special issue of Big Data and Society.



Image credit: NYT
Countercultures of Technology Use (2008-present). This project explores countercultures of technology use, especially among youth and families; this work has enriched our understandings of screen time, multitasking, technology in learning, utopianism in education reform, the digital divide, and non-use/techno-resistance. Across the numerous disciplines that investigate the relationships between humans and technology, most research focuses on technology use and technology users. User studies, usability metrics, techno-determinism, socio-determinism, user-centered design, participatory design, end-user programming, user appropriation, technology adoption, diffusion of innovations, technology and development: such research overwhelmingly focuses on people using technology.

This project seeks to problematize this focus by turning our attention to those who are left out of these use-centric models. Focusing explicitly on the limits, refusals, and countercultures of technology use can function as a dialectic maneuver, an inversion that provides a novel perspective on, and potentially a fuller understanding of, the complex, multifaceted relations between society and technology. We seek to explore these relationships and their broad ramifications rather than simply to transform non-users into users. Studying countercultures of technology use in their many forms can highlight sociotechnical configurations that may be overlooked due to a singular focus on technology use.

Selected output:
"Technology Non-Use," a special issue of First Monday.
"'Connected Learning' and the Equity Agenda: A Microsociology of Minecraft Play." Winner, Best of CSCW 2017.
"Managing Mobile Multitasking: The Culture of iPhones on Stanford Campus." Nominee, Best of CSCW 2013.
"Making Love in the Network Closet: The Benefits and Work of Family Videochat." Winner, Best of CSCW 2010.


Technological Charisma: The Social Meanings of One Laptop Per Child (2007-2019). Informing the projects on countercultures of technology use and the moral visions of Silicon Valley, this research details the history, results, and legacy of the One Laptop per Child (OLPC) project. Developing a theory of technological charisma, I examine the cultural history of OLPC, its founders, and the MIT Media Lab to reveal why the project was charismatic to so many. I found that developers’ imaginaries of the ‘hacker’ children who were meant to become obsessed with the laptop reflected American cultural ideals and the developers’ own privileged and idiosyncratic middle-class childhoods more than the childhoods of their intended beneficiaries in the Global South.

I then provide a detailed account of the day-to-day use of OLPC’s laptops in a well-regarded project of 10,000 laptops in Paraguay, based on seven months of ethnographic observations, 154 interviews, and quantitative analysis of test scores and breakage logs, conducted in 2010 and 2013. I observed that children there showed more agency in either rejecting the laptop or retrofitting it for media consumption than critics concerned with the project’s cultural imperialism assumed – but that their use was instead shaped by transnational media corporations. I situate these observations in the history of Latin American hacker culture to show why the laptop was especially charismatic there and what cultural forces shape its future. My results show how technologies like OLPC’s laptop become ‘charismatic’ and what the consequences of charisma can be. Reaching fifty years into the past and across the globe, I provide insights on utopian design, technology’s mythologized role in childhood, and the fine line between education and entertainment.

Selected output: The Charisma Machine: The Life, Death, and Legacy of One Laptop per Child. MIT Press, 2019.


Past Research Projects:

The Social Uses of Personal Photographs (2004-2007). The goal of this study was to understand the social uses of personal photography, including emerging cameraphone practices (prior to the release of the iPhone) as well as practices with other photographic devices. A secondary goal was to develop and refine methods for understanding the uses of – and resistance to – emerging technologies, grounded in social science methods and understandings, particularly STS and cultural studies.

The Technology and Poverty project (2004). In collaboration with the Stanford d.school, Digital Vision Fellows, and Ricoh Innovations, this project aimed to understand the ways that economically disadvantaged communities in the United States and beyond interact with social service and governmental organizations.

Remote Usability Methods (2003). Funded by the Distributed Mentor Project and conducted with the UrbanSim group at the University of Washington, this project investigated whether remote usability studies can achieve the same results as local studies and, if not, what is lost and why. Though remote usability studies are common in the field, little published research has explored how their results compare to those of local usability studies.

Ambient Displays (2001-2003). Funded by the CRA CREW, Intel Undergraduate Research, and David Scholar programs at UC Berkeley, this project designed and evaluated ambient displays to make residents' conceptions of the health of their city more visible and concrete. Ambient displays are ubiquitous computing devices – often embedded in interesting artistic objects or everyday artifacts – that provide a constant stream of peripheral information.


Contact: webmaster@morganya.org