Context: The Times
A couple weekends ago, I deactivated my Facebook account. It was tough. Kind of. I was there in the beginning, so there was some nostalgic reckoning. Like in the old days, when you had to get a new phone number and were faced with so many connections coming unhinged.
But the “why” of Facebook has long since left me. The digital utopianism that defined the internet and the last two decades of computational boom has faded in me.
The context is this: I’m an older, specialer millennial than my millennial family. Once I knew the specific name for my kind, but I don’t anymore. And I can’t even find it using my millennial powers of the “internet search.” Regardless, a special millennial but still a millennial. An explanation is all you really need: I’m a millennial who has grown up with the switch from mostly analog to digital. I have witnessed film, landlines, typewriters, VHS, DVDs, cassette tapes, and CDs. All obsolete now. I’m basically the elder millennial, the one who can speak up in village meetings and really account for where we have come from.
I try not to think about my generation because usually there is nothing but ire attributed to the word “millennial.” We are snowflakes and inhabitants of our parents’ basements. We are known for infusing the world with our tech, becoming enamored with our creation, and then losing the real world in the process.
Recently, a BuzzFeed article concerning millennials came across my feed: “How Millennials Became The Burnout Generation,” by Anne Helen Petersen. I was hesitant to note it as a worthy read. In fact, Petersen, being the same type of elder millennial that I am, must have been more hesitant to write it, even prophesying the incoming flak early in the piece.
You know of the hesitation. First of all: BuzzFeed. A place to get lost in the meaningless sphere of the internet where “Top 10” lists reside. And also a place that has broken some legitimate news and employs really good writers.
This is a product of the latter. Petersen contends that millennials are strivers in an economic culture that has lost its beefiness. At the same time, we have grown up alongside massive steps in technological efficiency and have emulated those same steps in our human work habits in order to succeed. Our jobs are now accessible by a tap of an icon while we sit on our couches with our smartphones. And this pervasiveness has burned us out. With endless tasks to be done and seemingly endless time to do them, we become jaded by even the smallest ones. Basically, we have too much choice, and we are beset by our ubiquitous access to tech and the internet and our generation’s call to arms to “make it” in this economic landscape. We have grown up competing in a land of drastic efficiency.
There is a part of me that wants to say that in the old days, typewriters and work documents were portable and ever-present enough to be in both home and workplace. I’m sure many professions had the same expectations of doing work at home or staying at the office late before our current technological eruption. But, now, the net is much wider. There are more things to ignore and to intentionally turn off. Even just the email on your phone is tantalizing to check while you’re in a long line at the grocery store.
When I started out as a teacher—and I’ll never forget this and how idealistic I was—I used all of my smartphone’s notifications. Work email and everything. I answered student email regularly, so much so that if I skipped a night, students would be anxious that their questions hadn’t been answered when they came to school the next day.
During an open house about two years into my career, I assured parents that they could email me at any time, and that I would get back to them as soon as possible. I wanted them to know that I was serious. But I remember seeing one father react by widening his eyes and making some gesture I can no longer picture, but in it, I saw my future: “Whoa. What? Jeez. Exhaustion will come. Just wait.”
I realized this gestural prophecy much later. Now, all my notifications are off, except for the notifications that make a phone a phone. There are no social media apps on my phone. (Although I must qualify that this comes and goes. But the stoic, philosophical, good-human part of me wants them off of the phone. All of them.) I do not mess around with new smartphone apps or things like that. I have not signed up for Snapchat.
More Context: Technological Forays
The momentum of technology was crazy when I grew up. I lived in a middle-class house that could afford computers, even though my parents never really used them. We had an early Apple IIe. I remember unsuccessfully begging my parents, almost yearly, for the next new model. RAM and hard drives and modem speeds were huge selling points for me, even though all I did on the thing was surf the internet or do school reports. (I did get into website creation when Geocities was popular, but that wasn’t too deep of a hobby.)
In my late teenage years, I switched to PCs because that’s where the games were. I took that computer, monitor and unwieldy tower and outrageously unbendable cording and all, to college. During my freshman year, I got nerdier when I built my own desktop. Then gaming became boring, and I became fascinated by a software movement surrounding Linux. Linux was great: it was free and community supported. There was no hidden agenda. And finally, when I was getting my teaching license, I used a credit card with no interest financing to buy my first laptop. This was around 2006 or 2007. Ignoring Microsoft and the computer manufacturer’s wishes, I wiped the whole hard drive clean and installed Linux.
Linux was a romantic thing for a college student. A free and open source operating system that was about community building, privacy, and affordability. I never paid for anything on Linux and had all the software I needed, even office suite software. Sure, it was nerdy in the sense that you had to be patient and able to copy and paste different instructions into a terminal. Sometimes you’d have to reinstall the whole operating system when something didn’t go right. But it was mostly solid. I hardly had to worry about buying the new expensive version of Office or anything like that. I even recorded multi-track music with Linux using a free program still around today.
Then the iPhone came out. Holy moly: how expensive! And how cool! I was envious, but I held no delusions that I could afford such a thing. When I was into my first year of teaching, the third iPhone iteration came out. I had drastically more money than I was used to. The money wasn’t amazing, but it was enough to start me thinking about all the things I owned that I had been making do with. So, I anxiously walked into the Apple Store and, sweating while I paid, bought an iPhone. Shortly after, I replaced my Linux laptop with a Macbook Pro. I felt grown up.
Meanwhile, this “free” ethos was working its way into everything. It was a given that buying computational things meant buying hardware, even if what you were buying was essentially software. I was, as I’m sure everyone was, wary of the first pieces of software that were available for download. It just seemed to cheapen the experience. To make it seem not worth it if all companies had to do was upload software to a server and wait for downloads instead of getting graphic designers and who knows who else to produce boxes, instruction manuals, and the discs the things came on. Wouldn’t such stuff be vastly cheaper? I mean, Linux was free!
During this push to buy truly hardware-less software, we all happily went to the free stuff online. Wait, Google, that great search engine that is simple and nonintrusive, is making a free email client? Sold. Forever! Facebook is free. That’s pretty sweet. Look at all this free software on the internet!
I don’t think I had this really good millennial-induced argument against our privacy being eroded. I can’t recall a very specific opinion. It was just part of the exchange. Free was far more awesome than paying. No one read the Terms of Service, and the calls for us to look at them by a new breed of tech journalists were considered a little too much.
Ads of America
I just finished teaching Ray Bradbury’s Fahrenheit 451 to my 9th grade classes, and I still wonder if we aren’t already in Bradbury’s book-burning fictional world. Sure, it’s easy to say that we don’t burn books. End of discussion. But the thing that gets me the most is that Bradbury had the sense to create a dystopian world in which people themselves were the ones who entropied their own capacity for deep thought.
In his time, Bradbury had TV to contend with. Sure. Radio was there too. And tabloids. The ads were probably much more buyer-beware than they are today, in the sense that doctors were selling cigarettes in commercials.
And with that new driving culture of the 1950s, billboards were a blot on the environment. We don’t think about them now. They are normal. Which makes me think of Jon Mooallem’s discussion of the “shifting baseline syndrome” in his wonderful book, Wild Ones: “All of us adopt the natural world we encounter in childhood as our psychological baseline–an expectation of how things should be–and gauge the changes we see against the norm” (Mooallem 129). It’s hard to feel loss through generations. Those of us old enough to lament what it was like before the internet may feel a sense of loss that younger humans do not know. It is a romantic past for these young humans, just like a billboardless highway is a romantic past for pre-internet-knowing adults.
To see years of ad psychology in effect, take a spin on HGTV for a while and note the hooks a show sets to get you through the ads. Of course, ads are no longer as overtly malicious as a doctor selling you cigarettes. But it’s one thing to have four minutes of commercials for a half-hour program, and it’s another for the program to spend time keeping you locked in, building you up with music, voiceovers, and cut camera shots, to get you through that next hump of commercials even if the event that is promised doesn’t come until the batch of commercials after that.
It’s odd to think of our entertainment as the product of a symbiosis of creatives and marketers. Actual non-ad-driven patronage of the arts has been relegated to helping those who are aspiring and not those of the mainstream. Maybe some of us wonder how ads have shaped the entertainment we love. And maybe some of us wonder what the entertainment industry would be like without ads or product placements.
If you want to get highbrow, you can pick up a New Yorker or a Harper’s Magazine and see advertisements there too, even if you pay for a subscription. We are still beholden in the age of Netflix.
I, like many other millennials, skipped the landline costs and then decided to skip the TV’s cost as well. Together they cost a lot in monthly fees, approaching, if not over, $200. So when TV came to the internet, it was an easier choice than any other digital purchase. And best of all, no commercials. (Then again, one could look at it as a one-to-one switch from landline to cell phone, and a replacement of the TV payment with the internet hook-up, and then add on all the internet TV services. Maybe we pay more?)
And to be honest, I remember just despising TV. I felt locked into its time slot. And with all those advertisements, TV seemed to lack flow. You’d always be wondering at what point in a show’s tension would advertisements come into play.
America was largely a TV culture and it still is today. And even with its new shape, its new contours, TV is still the comfort food of America.
But I write of TV to bring us back to Facebook and Google.
Ads and Innovation
We joined Facebook back in the early aughts because it was new and could be the future and everyone was doing it and we were just hungry for what was to come. Would this be the new phone book? How neat!
And then we slowly wondered how Facebook was making its money. We all know that nothing is free. In the Linux world, free comes from the donated time and effort and skill of computer programmers who keep things running. Things are necessarily slow there: make it simple and make it work.
And then we have all the advertisement and data ruckus that Facebook has been involved in.
I deactivated my account partly because of Facebook’s recent negligence and partly because Facebook is no longer the simple thing I used to use. When I did log in, I would see tons of red notifications from algorithmically chosen friends telling me what they had posted. Even after turning every single notification off and searching endlessly for how to absolutely disable them all, Facebook wouldn’t let me. They held control of these mandatory notifications, the ones from supposed close friends, telling me what they really valued: my time on Facebook and not my actual time.
It’s a shame. Facebook’s Events feature was a nice thing to have. And it was nice to be able to go on it once in a while and connect with a long-ago friend who now lives somewhere very different from where I live. But if we are going to navigate this digital stuff that’s becoming more pervasive, we must have principles. This brings me to Google.
I signed up for the Gmail beta program because I saw similar aesthetics to Facebook. Google was clean and simple. It didn’t flash popups or images as ads at me. It used simple and non-obtrusive text. And, one-upping Facebook, Google had a motto that I liked: “Don’t be evil.” It showed that they knew what was to come if they were successful, and it showed massive amounts of maturity.
Not that advertisements are bad by nature. The best are the most noble: products chosen by show creators and willing advertisers. Some podcasts have done well with this, choosing the ads they feel best represent their audience and themselves. There is a sort of pride in that. The consumer is entering into a community there. One can feel good about that sort of advertising.
But the most effective advertising doesn’t need the hands of our new innovative world at all. This is simple word-of-mouth advertising. We trust who we know. And we love anecdotes. Word-of-mouth advertising provides a trustworthy space for both. It’s almost to a fault: if we hear one bad anecdote from a friend about a product, we may never buy that product again, even if it is an objectively good product.
Swinging to the other side, the worst kind of ads are the ones that control whoever is advertising. In content terms, ads that deform or even tweak content are terrible.
But what about a large tech company that collects your data, not to sell it, but to pinpoint which ads are best for you, based on the advertisers willing to pay for the opportunity and the algorithms that define the opportunity? Sure, Google could end up getting some good ads in there. But Google must exercise strong restraint to keep its products focus-worthy. We don’t drive down highways, see a billboard, and immediately get off the highway to buy whatever it is that billboard is selling. When you are doing something important, you don’t want to be messed with in terms of advertisements.
It is worth pointing out that it is now more inconvenient to get away from ads than to live among them. We must turn off our phones and go “off the grid,” which may now be impossible to do in our own homes unless we are among the savviest of technologists.
Not surprisingly, the US spends far more on advertising than other countries. In 2015, the US spent double what China, a country with four times the population, spent. And in the global ad market, Google has kept up a steady 32% share.
Even the inventor of the current web, Tim Berners-Lee, is worried about what this has all become. We have behemoth companies that control a lot of the internet’s market share: Google, Facebook, and Amazon. Berners-Lee makes a good point: we need to scrutinize the controllers of content. It’s amazing that we not only have to worry about who we sign up with to provide us access to the internet, but we must also be aware of the sub-internets that control our data.
The counterpoint to all of this worry is that this was bound to happen. TV hasn’t ruined everything. Neither has the radio. There will be some further disruption when we get tired of a privacy-worried narrative and finally do something about it. Maybe a new company that respects privacy will jump into the fray.
Then again, there are many companies that tout privacy as their cutting edge, and nothing has really happened. Google has created its entire behemoth company, building smartphones, computers, an office suite, and intricate navigation software, and still having tons of money left over to build a “humanitarian” project that will extend the internet using balloons. (I use quotes because of Bill Gates’s seemingly wise comments on the project.)
And maybe this is all not that big of a deal. We don’t need massive amounts of data to manipulate people. We are good at that without the internet.
I write all this but still use Google and Twitter. This essay was largely written in Google Docs. I have wondered if I should quit these many times, but the ubiquity and convenience and just sheer effort of changing has always swayed me to stay. I want to go back to those Linux days, but the effort, sheesh!
I do want less personalization in my life. Bradbury was right: the enemy of deep thinking is efficiency. We want to make sure that we can standardize industrial processes to help us build difficult products. That’s a great invention. But when it comes to entertainment and general things in life, giving yourself a bubble predicated on your history is needlessly prophetic. It puts us in a cycle that we have created, aided by algorithms.
I want to enjoy new things. I want to be able to wander heedlessly. I don’t want my past choices to always determine what I will want to do or like.
So, Facebook is gone for now. People can still reach me if they want. The communication infrastructure is definitely there. Google is used, but so are adblockers. And my qualms about privacy reside in this weird artist’s garret, a small watchful perch above the multiple floors of the regular order of life, a place to live and mull over all this technology, and to shout down to myself, “I told you so.”