Last year, we decided to start a magazine. We felt disappointed by the mainstream conversation about technology, and we wanted to create space for an alternative. Then Trump happened. At first, nothing seemed to matter. Why publish a magazine about technology in the age of Trump? Why write or read about anything but Tr...
The first one is easy: Fuck Trump. We have to fight him, and we will. But we’re not going to let him monopolize our bandwidth. That’s what he wants. Trump is the hideous id of the internet. He will do anything for our clicks—and he wants all of our clicks. But modern operating systems have made us good multitaskers. We...
The second reason is more important: Trump is a wake-up call. His election proves that many of the wise men and women who claim to understand how our world works have no clue. He is a reminder that we can’t afford to have stupid conversations about important things anymore. Trump is a creature of technology. A technolo...
Technology will continue to serve our Troll-in-Chief, as his Administration and his allies in Congress hunt undesirables with databases and drones. Some of their barbarisms will be dramatic and highly visible. Others will be piecemeal, discreet, and hard to grasp immediately. We need intelligent writing to make both fo...
The era of Trump will be a technologized era, like the one before it. We will be paying attention—and proposing ways to resist. (Along these lines, stay tuned for Tech Against Trump, a short book coming soon from Logic.) We will also be in the streets. We hope you’ll join us. Yours, Jim Fingal Christa Hartsock Ben Tarn...
We can’t stop watching. We’re so, so bored. Tech is magic. Tech lets us build worlds and talk across oceans. Whatever kind of freak we are—and most of us are several kinds—tech helps us find other freaks like us. But most tech writing is shallow and pointless. It’s nobody’s fault; everyone is just doing their job. Comm...
We deserve a better conversation. By “we,” we mean you, because everyone uses technology. We are all both its subject and object. Tech is how you find the place you live. It’s how you turn your car into a taxi or your spare room into a hotel. Tech lets you see the faces of the people you love from thousands of miles aw...
Someday, when you do swipe right on that special someone, and they swipe right on you—it’s a match!—tech will shape how you flirt, how you define the relationship, how you plan and brag about your wedding. When you have children, you will use tech to track their development and find a babysitter. By then, maybe the bab...
The stakes are high, is what we are saying. Like you, we are both insiders and outsiders. Luckily, this is exactly the position you need to be in to observe and describe a system. In the social sciences they call it “logic”: the rules that govern how groups of people operate. For engineers, “logic” refers to the rules ...
In the beginning, my plan was perfect. I would meditate for five minutes in the morning. Each evening before bed, I would do the same. There was only one catch: instead of relying on my own feelings, a biofeedback device would study my brainwaves to tell me whether I was actually relaxed, focused, anxious, or asleep.
By placing just a few electrodes on my scalp to measure its electrical activity, I could use an electroencephalography (EEG) headset to monitor my mood. Whereas “quantified self” devices like the Fitbit or Apple Watch save your data for later, these headsets loop your brainwaves back to you in real time so that you can...
Basic EEG technology has been around since the early twentieth century, but only recently has it become available in portable, affordable, and Bluetooth-ready packages. In the last five years, several start-ups—with hopeful names like Thync, Melon, Emotiv, and Muse—have tried to bring EEG devices from their clinical an...
I first learned about the headsets after watching the Muse’s creator give a TEDx talk called “Know thyself, with a brain scanner.” “Our feelings about how we’re feeling are notoriously unreliable,” she told the audience, as blue waves and fuzzy green patches of light flickered on a screen above her. These were her own ...
Their actual meaning was indecipherable, at least to me. But they supported a sales pitch that was undeniably attractive: in the comfort of our own homes, without psychotropic meds, psychoanalysis, or an invasive operation, we could bring to light what had previously been unconscious. That was, in any case, the dream.
Day One When I first placed the Muse on my head one Sunday evening in late October, I felt as though I was greeting myself in the future. A thin black band, lightweight and plastic, stretched across my forehead. Its wing-like flanks fit snugly behind my ears. Clouds floated by on the launch screen of its accompanying i...
To encourage me not to give up before I’d even started, the assistant kept talking to me. Her calming female voice told me how to delicately re-adjust the electrodes to get the signal working. The ones behind my ears were having trouble aligning with the shape of my head. Eventually, the Muse accurately “sensed” my bra...
Tap the button, she said encouragingly. “I’m ready,” I clicked, and my first five-minute meditation session began. Inward bound, I sat at my desk with the lamp on and closed my eyes. Waves crashed loudly on the shore, which indicated that I was thinking too much. But from time to time, I could hear a few soft splashes ...
After what seemed like forever, it was over. Like all self-tracking practices (and rather unlike a typical meditation session), it seemed that the post-game was as important as the practice itself. Knowing this, I made a good-faith effort to pore over my “results.” They were, at first, second, and third glance, impenet...
Equally inscrutable were the two awards I had earned. Whatever they were, I thought, they were hardly deserved, considering I had so far spent a total of seven minutes scanning my brain. One was for tranquility—“Being more than 50% calm in a single session must feel good,” the award told me. The other was a “Birds of E...
It felt great to meditate for the first time only to be told that I was already off to a good start. But I knew deep down—or, at least, I thought I knew—that I had not felt calm during any part of the session. I was in the difficult position, then, of either accepting that I did not know myself, in spite of myself, or ...
The Brain Doctor The second morning of my experiment, I took the subway uptown to see Dr. Kamran Fallahpour, an Iranian-American psychologist in his mid-fifties and the founder of the Brain Resource Center. The Center provides patients with maps and other measures of their cognitive activity so that they can, ideally, ...
Some of Fallahpour’s patients suffer from severe brain trauma, autism, PTSD, or cognitive decline. But many have, for lack of a better word, normal-seeming brains. Athletes, opera singers, attorneys, actors, students—some of them as young as five years old—come to Fallahpour to improve their concentration, reduce stres...
Dr. Fallahpour’s offices and labs lie on the ground floor of a heavy stone apartment building on the Upper West Side. When I arrived, he was in the middle of editing a slideshow on brain plasticity for a talk he was due to give at an Alzheimer’s conference. On a second, adjacent monitor, sherbet peaks and colored waves...
Fallahpour wears bold glasses with thick-topped frames, in the style of extra eyebrows. When we met, he was dressed in a dark blue suit to which was affixed a red brooch shaped like a coral reef or a neural net—I kept meaning to ask which. An enthusiastic speaker with a warm bedside manner, he made it hard to shake the imp...
His staff had not yet arrived that morning, he apologized, so he would be fielding any calls himself. As if on cue, the phone rang. “No. Unfortunately, we do not take insurance,” he told the caller. “That happens a lot,” he explained, after hanging up. “Now, where were we?” Before turning to brain stimulation technolog...
Like many of his fellow researchers, Fallahpour was interested in how to improve the brain through conditioning, electrical and magnetic stimulation, and visual feedback. He began to work with an international group of neuroscientists, clinicians, and researchers developing a database of the typical brain. They intervi...
Neuroscience has always had this double aim: to know the brain and to be able to change it. Its method for doing so—“screen and intervene”—is part of the larger trend toward personalized medicine initiatives. Advance testing, such as genomics, can target patients at risk for diabetes, cancer, and other diseases. With t...
Under the twenty-first century paradigm of personalized medicine, everyone becomes a “potential patient.” This is why the Brain Resource Center sees just as many “normal” patients as symptomatic ones. And it’s why commercial EEG headsets are being sold to both epileptics trying to monitor their symptoms and office work...
Brain training is seductive because its techniques reinforce an existing neoliberal approach: health becomes a product of personal responsibility; economic and environmental causes of illness are ignored. Genetics may hardwire us in certain ways, the logic of neuroliberalism goes, but hard work can make us healthy. One...
Consider Fallahpour’s boot camp for elementary school kids. For a few hours each day during school vacations, the small rooms of his low-ceilinged offices are swarmed with well-behaved wealthy children playing games to “improve brain health and unlock better function,” as well as to acquire a “competitive advantage.” “...
Bettering Myself The more I thought about the kind of cognitive enhancement Fallahpour promised, the more trouble I had remembering the last time I felt clear-eyed and focused. Had I ever been? Could I ever be? For a few days I had sensed a dull blankness behind my eyes. I wondered if it was a head cold, or sleep depri...
On a good day, I convinced myself, there was no way I was operating above sixty percent, maybe sixty-five. Sixty percent of what, I wasn’t sure. But I knew I could do better. I felt a twinge of envy toward those who had achieved the mythical “peak performance,” and I redoubled my commitment to self-improvement.
The headset remained subtly encouraging. “Whatever you’re experiencing right now is perfect,” my meditation assistant assured me—just moments before my fourth session’s calibration had paused, again, because the signal quality was too low. I re-adjusted my headset, practicing patience. “Training your mind is kind of li...
When I mentioned my experiments to a friend, he recommended that I watch a performance by the conceptual programmer Sam Lavigne. In “Online Shopping Center,” Lavigne trains a DIY EEG device to identify whether his brain is thinking about shopping online or his own mortality. Being either “shopping-like” or “death-like”...
Test Subject When I went to see Dr. Fallahpour for a follow-up visit, I was running thirty minutes late. I had forgotten to transfer trains at 59th Street because I had gotten distracted trying to make sense of an advertisement chastening me for my distraction: “Daydreamed through your stop, again?” it asked.
I had, but I didn’t know it yet. It wasn’t until 116th Street that I realized I had missed my stop—in fact, I had missed several. Still, I felt a low current of satisfaction when I emerged into the sunlight at 125th Street, far from where I needed to be. Such inattention made me a more viable patient for brain traini...
Fallahpour and I decided I would try a calm protocol first, followed by one that rewarded my brain for focus. While he gelled the electrodes and placed them on my scalp, I asked him about some of the skepticism surrounding EEG headsets—namely, the fact that many people, myself included, found it difficult to tell what ...
“EEG is a crude tool and it isn’t the best we have, but it’s the most convenient in many ways,” he explained. “It’s prone to a lot of ‘garbage in and out.’” But when done correctly, he added, it can be “useful and quite powerful.” Deciphering signals from the noise required the trained judgment of an expert like Falla...
To start, we took a baseline measurement of my brain. I had very quick recovery, or response, or something, in terms of what I think were my alpha waves. This meant that my ability to calm myself was sophisticated. I felt surprised at first, and dumbly flattered, much like I had during my first session with the Muse.
For the calm protocol, classical music cut in and out of my headphones depending on whether certain frequencies in my brain were active. This was visualized by red and blue columns flanking both sides of the screen. I was supposed to keep the colors under certain thresholds in their respective containers. At one point,...
When we tested my concentration, the settings were adjusted to exercise different kinds of brainwaves. I was tasked with keeping a blue column at a certain level while not letting other red columns reach a certain height. It was more difficult than meditating with my Muse—but also, because it was a game, more enjoyable...
Know Nothing By the end of my week with the Muse, my results were as perilously inscrutable as they had been at the start. Thousands of birds had chirped in my ear. An infinity of waves had crashed upon an endless shore. I had earned quite a few more badges, some by the sheer virtue of persisting: adjusting the signal,...
I had learned very little about myself. This in itself wasn’t surprising. But if the EEG headsets were supposed to teach anything, their lesson was somewhat contradictory: I should know myself, but I should also be prepared to be wrong about what I knew. In this respect, the headset was more like the Oracle of Delphi’...
The more I parsed my personal graphs and charts, the more I arrived at the same conclusions as anyone who has ever taken more than a passing glance at the brain. Our tools aren’t good enough. At least not yet. And the inadequate and embarrassing analogies we use to describe our brains do little to help us see ourselves...
When the next Sunday came around, I was just as anxious about relaxation and relaxed about anxiety as I had been the week before. I still didn’t know whether I wanted to go shopping. Other times I thought I was thinking about death, though I couldn’t be sure. Who knows, maybe I would never know. I might even die that w...
As the Trump Administration enters its first hundred days, the 2016 election and its unexpected result remain a central topic of discussion among journalists, researchers, and the public at large. Notable is the degree to which Trump’s victory has propelled a broader, wholesale evaluation of the defects of the mode...
This isn’t the first time that the internet has figured prominently in a presidential win. Among commentators on the left, the collective pessimism about the technological forces powering Trump’s 2016 victory is matched in mirror image by the collective optimism about the technological forces driving Obama’s 2008 vict...
This troubled internet has been around for years. Fears about filter bubbles facilitating the rise of the alt-right can and should be linked to existing concerns about the forces producing insular, extreme communities like the ones driving the Gamergate controversy. Fears about the impotence of facts in political debat...
The Wise Crowd The Obama and Trump elections might be read as the bookends of a story about the impact of the internet on society. How do we size up the nearly ten years between 2008 and 2016? How do we understand what happened on the internet during that time, and the ripple effect it had on the public sphere? We can ...
One critical anchor point is the centrality of the wisdom of the crowd to the intellectual firmament of Web 2.0: the idea that the broad freedom to communicate enabled by the internet tends to produce beneficial outcomes for society. This position celebrated user-generated content, encouraged platforms for collective p...
Inspired by the success of projects like the open-source operating system Linux and the explosion of platforms like Wikipedia, a generation of internet commentators espoused the benefits of crowd-sourced problem-solving. Anthony D. Williams and Don Tapscott’s Wikinomics (2006) touted the economic potential of the crowd...
Faith in the collective intelligence of the crowd didn’t go unchallenged. Contemporary authors like Andrew Keen railed against the diminishing role of experts in The Cult of the Amateur (2007). Jaron Lanier’s You Are Not A Gadget (2010) warned of individual intelligence being replaced by the judgment of crowds and algo...
Yet regardless of the critics, the belief in the wisdom of the crowd framed the design of an entire generation of social platforms. Digg and Reddit—both empowered by a system of upvotes and downvotes for sharing links—surfaced the best new things on the web. Amazon ratings helped consumers sort through a long inventory...
Intelligence Failure The platforms inspired by the “wisdom of the crowd” represented an experiment. They tested the hypothesis that large groups of people can self-organize to produce knowledge effectively and ultimately arrive at positive outcomes. In recent years, however, a number of underlying assumptions in this f...
First, the wisdom of the crowd assumes that each member of the crowd will sift through information to make independent observations and contributions. If not, it hopes that at least a majority will, such that a competitive marketplace of ideas will be able to arrive at the best result. This assumption deeply underestim...
Second, collective intelligence requires aggregating many individual observations. To that end, it assumes a sufficient diversity of viewpoints. However, open platforms did not generate or actively cultivate this kind of diversity, instead more passively relying on the ostensible availability of these tools to all.
There are many contributing causes of the resulting biases in participation. One is the difference in web-use skills across demographic groups. Another is the power of homophily: the tendency for users to clump together based on preferences, language, and geography—a point eloquently addressed in...
Third, collective intelligence assumes that wrong information will be systematically weeded out as it conflicts with the mass of observations being made by others. Quite the opposite played out in practice, as it ended up being much easier to share information than to evaluate its accuracy. Hoaxes spread very effective...
Crowds also arrived at incorrect results more often than expected, as in the high-profile misidentification of the culprits by Reddit during the Boston Marathon bombing. The failure of the crowd to eliminate incorrect information, which seemed sufficiently robust in the case of something like Wikipedia, did not apply t...
Fourth, collective intelligence was assumed to be a vehicle for positive social change because broad participation would make wrongdoing more difficult to hide. Though this latter point turned out to be arguably true, transparency alone was not the powerful disinfectant it was assumed to be. The ability to capture poli...
The resulting ecosystem feels deeply out of control. The promise of a collective search for the truth gave way to a pernicious ecosystem of fake news. The promise of a broad participatory culture gave way to campaigns of harassment and atomized, deeply insular communities. The promise of greater public accountability g...
Reweaving the Web So, what comes next? Has a unique moment been lost? Is the ecosystem of the web now set in ways that prevent a return to a more open, more participatory, and more collaborative mode? What damage control can be done on our current systems? It might be tempting to take the side of the critics who have l...
It would also miss the complex changes to the internet in recent years. For one, the design of the internet has changed significantly, and not always in ways that have supported the flourishing of the wisdom of the crowd. Anil Dash has eulogized “the web we lost,” condemning the industry for “abandon[ing] core values t...
The wisdom of the crowd’s critics also ignore the rising sophistication of those who have an interest in undermining or manipulating online discussion. Whether Russia’s development of a sophisticated state apparatus of online manipulation or the organized trolling of alt-right campaigners, the past decade has seen ever...
To the extent that the vision of the wisdom of the crowd was naive, it was naive because it assumed that the internet was a spontaneous reactor for a certain kind of collective behavior. It mistook what should have been an agenda, an ongoing program for the design of the web, for the way things already were. It assumed ...
In short, the wisdom of the crowd didn’t describe where we were, so much as paint a picture of where we should have been going. Fulfilling those failed aspirations will require three major things. Platforms must actively protect the crowd’s production of wisdom. The visibility of collective decision-making and the dram...
The mission needs to be drawn broader than code. Ensuring that the wisdom of the crowd can produce social change means creating pathways for offline action that can effectively challenge wrongdoing. Ensuring that the wisdom of the crowd can reach accurate results requires more inclusive, diverse bodies of participants....
Experimentation must be accelerated at the edges. Although we depend heavily on a few key platforms, the internet is still a vast space. Today’s platforms emerged from experimentation at the edges. To produce a new generation of robust platforms, we need more experimentation—a proliferation and wide exploration of altern...
It remains an open question whether the internet is traveling down the same, well-worn paths followed by all communications infrastructures, or whether it represents something truly new. But to accept the current state of affairs as inevitable falls prey to a fatalistic pessimism that would only further compound the pr...
The vision of collective participation embedded in the idea of the wisdom of the crowd rests on the belief in the unique potential of the web and what it might achieve. Even as the technology evolves, that vision—and a renewed defense of it—must guide us as we enter the next decade. Technology has a gender problem, as ...
It’s not always obvious to outsiders, but the term “technology sector” is a catch-all for a large array of distinct jobs. Of course there are PR, HR, and management roles. But even if we confine ourselves to web development, technical people often distinguish among “front-end,” “back-end,” and “full-stack” development....
In practice, the distinction is murky: some developers refer to everything user-facing as the front-end, including databases and applications, and some developers use front-end to mean only what the user sees. But while the line shifts depending on who you’re talking to, most developers acknowledge its existence.
I spoke to a number of developers who confirmed something I’d sensed: for some time, the technology industry has enforced a distinct hierarchy between front-end and back-end development. Front-end dev work isn’t real engineering, the story goes. Real programmers work on the back-end, with “serious” programming language...
Are women really more likely to be front-end developers? Numbers are hard to pin down. Most studies consider the tech sector as a single entity, with software engineers lumped together with HR professionals. A 2016 StackOverflow user survey showed that front-end jobs—“Designer,” “Quality Assurance,” and “Front-End Web ...
We need better numbers, as feminist developers have been saying for years, but it also doesn’t seem like a huge stretch to take developers at their word when they say that front-end development is understood to occupy the girlier end of the tech spectrum. Front-end developers, importantly, make about $30,000 less than ...
Sorting the Stack The distinction between back and front wasn’t always so rigid. “In the earliest days, maybe for the first ten years of the web, every developer had to be full-stack,” says Coraline Ada Ehmke, a Chicago-based developer who has worked on various parts of the technology stack since 1993. “There wasn’t sp...
For many people who are teaching themselves to code, front-end work is the lowest-hanging fruit. You can “view source” on almost any web page to see how it’s made, and any number of novices have taught themselves web-styling basics by customizing WordPress themes. If you’re curious, motivated, and have access to a comp...
Which is not to say it’s easy, particularly at the professional level. A front-end developer has to hold thousands of page elements in her mind at once. Styles overwrite each other constantly, and what works on one page may be disastrous on another page connected to the same stylesheet. Front-end development is taxing,...
“Serious” developers often avoid acknowledging this by attributing front-end expertise not to mastery but to “alchemy,” “wizardry,” or “magic.” Its adepts don’t succeed through technical skill so much as a kind of web whispering: feeling, rather than thinking, their way through a tangle of competing styles.
“There’s this perception of it being sort of a messy problem that you have to wrangle with systems and processes rather than using your math-y logic,” says Emily Nakashima, a full-stack developer based in San Francisco. That’s not true, of course; nothing on a computer is any more or less logical than anything else. Bu...
The gendered attributes switch as you travel to the back of the stack. At the far end, developers (more often “engineers”) are imagined to be relentlessly logical, asocial sci-fi enthusiasts; bearded geniuses in the Woz tradition. Occupations like devops and network administration are “tied to this old-school idea of y...
“If you’re worried about your professional status, one way to police gender boundaries is through educational credentials,” says Ensmenger. “The other way, though, is genius. And that’s something I think nerd culture does really well. It’s a way of defining your value and uniqueness in a field in which the relationship...
When programming professionalized, women got pushed out. Marie Hicks, a computing historian who’s looked closely at this phenomenon, explains that as programming came to be viewed as more important to national and corporate welfare, hiring managers began associating it with a specific set of skills. In the British case...
The Dangerous Myth of Meritocracy The case of the female front-end developer is flipped in the other direction—it’s a feminizing subfield, rather than a masculinizing one. But it’s governed by many of the same market forces that edged women out of programming in the first place: prestige accrues to labor scarcity, and ...
No one says any of this explicitly, of course, which is why the problem of women in technology is thornier than shoehorning women onto all-male panels. The developers I spoke to told me about much more subtle, very likely unconscious incidents of being steered toward one specialization or another. Two different women t...
And everyone can rattle off a list of traits that supposedly make women better front-end coders: they’re better at working with people, they’re more aesthetically inclined, they care about looks, they’re good at multitasking. None of these attributes, of course, biologically inhere in women, but it’s hard to dispute t...
Once you’re cast as a front-end developer, it can be challenging to move to different parts of the stack, thus limiting the languages and development practices you’re exposed to. “Particularly in Silicon Valley, there’s a culture of saying developers should always be learning new things,” says Nakashima, the San Franci...
Hicks, the computing historian, can’t stand it when people tout coding camps as a solution to technology’s gender problem. “I think these initiatives are well-meaning, but they totally misunderstand the problem. The pipeline is not the problem; the meritocracy is the problem. The idea that we’ll just stuff people into ...
In more ways than one, medicine is dying. A 2015 article in JAMA: The Journal of the American Medical Association suggests that almost a third of medical school graduates become clinically depressed upon beginning their residency training. That rate increases to almost half by the end of their first year.
Between 300 and 400 medical residents commit suicide annually, one of the highest rates of any profession, the equivalent of two average-sized medical school classes. Survey the programs of almost any medical conference and you’ll find sessions dedicated to contending with physician depression, burnout, higher-than-ave...
At the risk of sounding unsympathetic, medicine should be difficult. No other profession requires such rigorous and lengthy training, such onerous and ongoing scrutiny, and the continuous self-interrogation that accompanies saving or failing to save lives. But today’s crisis of physician burnout is the outcome of more ...
The Rise of the Electronic Medical Record An electronic medical record, or EMR, is not all that different from any other piece of record-keeping software. A health care provider uses an EMR to collect information about their patient, to describe their treatment, and to communicate with other providers. At times, the EM...
And if that’s all there were to it, a doctor using an EMR would be no more worrisome than an accountant switching out her paper ledger for Microsoft Excel. But underlying EMRs is an approach to organizing knowledge that is deeply antithetical to how doctors are trained to practice and to see themselves. When an EMR imp...
When building a tool, a natural starting point for software developers is to identify the scope, parameters, and flow of information among its potential users. What kind of conversation will the software facilitate? What sort of work will be carried out? This approach tends to standardize individual behavior. Software ...
Yet medicine is uniquely allergic to software’s push towards standards. Healthcare terminology standards, such as the Systematized Nomenclature of Medicine (SNOMED), have been around since 1965. But the professional consensus required to determine how those terms should be used has been elusive. This is partly because ...
More acutely, medicine avoids settling on a shared language because of the degree to which it privileges intuition and autonomy as the best answer to navigating immense complexity. One estimate finds that a primary care doctor juggles 550 independent thoughts related to clinical decision-making on a given day. Though t...
Over the last several years, governments, insurance companies, health plans, and patient groups have begun to push for greater transparency and accountability in healthcare. They see EMRs as the best way to track a doctor’s decision-making and control for quality. But the EMR and the physician are so at odds that rathe...
Inputting information in the EMR can take up as much as two-thirds of a physician’s workday. Physicians have a term for this: “work after clinic,” referring to the countless hours they spend entering data into their EMR after seeing patients. The term is illuminating not only because it implies an increased workload, b...
The EMR causes an excruciating disconnect: from other physicians, from patients, from one’s clinical intuition, and possibly even from one’s ability to adhere faithfully to the Hippocratic oath. And, if the link between using a computer and physician suicide seems like a stretch, consider a recent paper by the American...
Drop-down menus and checkboxes not only turn doctors into well-paid data entry clerks. They also offend medical sensibility to its core by making the doctor aware of her place in an industrialized arrangement. From Snowflake to Cog Physicians were once trained through an informal system of apprenticeship. They were ove...

The text of all the articles from Logic Magazine issues 1-18.

logic_raw.txt - The articles are separated by three newlines. Each paragraph is on its own line.

logic_passages.txt - The articles, broken up into passages of between 300 and 2,000 characters. Each passage is on its own line.
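For readers who want to work with these files, here is a minimal parsing sketch, assuming the formats are exactly as described above (articles delimited by three consecutive newlines in logic_raw.txt; one passage per line in logic_passages.txt). The helper names and the inline sample are illustrative, not part of the dataset.

```python
def split_articles(raw_text: str) -> list[str]:
    """Split a raw dump into articles on the three-newline separator."""
    return [a.strip() for a in raw_text.split("\n\n\n") if a.strip()]

def split_passages(passages_text: str) -> list[str]:
    """Return one passage per non-empty line."""
    return [line for line in passages_text.splitlines() if line.strip()]

# Tiny inline sample standing in for logic_raw.txt:
sample = "Paragraph one.\nParagraph two.\n\n\nSecond article begins here.\n"
print(len(split_articles(sample)))  # 2
```

In practice you would read each file (e.g. `open("logic_raw.txt", encoding="utf-8").read()`) and pass its contents to the matching helper.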
