
Saturday, January 20, 2018

Is Apple Responsible for the Well-Being of Our Kids?

I was surprised to see that the most-viewed article on the blog this week was one that I wrote almost 5 years ago, What to Do if Your Kids Are Obsessed with Technology. (Thanks to whoever shared that on Facebook!) Reading it again, it seems to hold up fairly well, and I would still offer the same advice to parents wondering what to do if their young one seems too drawn to screens.

However, many things have changed since 2013. Smartphones have gotten so ubiquitous that every teenager I see has one, and most children older than 8 or 9 seem to have one as well (and if not a smartphone, then almost certainly a tablet or Chromebook). Snapchat and Instagram have gotten ever more entrenched as the platforms of choice for young people’s socializing and selfie-expression, and games with addictive mechanisms have proliferated like weeds. We’re even having a cultural conversation about whether a generation has been destroyed because of the effects of smart devices. I don’t think it’s gone quite that far, but I hear constantly from parents about the difficulty they have trying to separate their kids from their screens. 

Now, I am an old-school Apple fan, using Apple computers and devices almost exclusively ever since my first experiences with an Apple II in elementary school. So it was with great interest that I saw the recent headlines about investors calling for Apple to look into how their technology may be harming kids and to mitigate any potential harms. Then, earlier this week, Farhad Manjoo went even further with an article in the New York Times about how Apple can help save all of us, adults included, from the attention-grabbing consequences of their technology by building a “less-addictive iPhone.” I agree with Manjoo that this represents a great opportunity for Apple, since their business model does not depend primarily on people using their devices nonstop. However, designing software to be less addictive for all is a much more complicated issue than putting in better parental controls for minors, so I’m going to focus on the latter for the rest of this post.

Ultimately, I think the answer to the question posed in the title of this article is that parents are responsible for their kids, but parents need help, and Apple can do a lot more to make it easier for parents to set appropriate limits. Even for savvy parents who do not allow screens in bedrooms, sometimes the lure of the device is so tempting that a kid will sneak it into their room at night. Many parents I work with try to set screen time limits, but they can’t keep watch on their kids all the time, and it’s hard for a parent to know how much time a kid is spending watching videos vs. playing games vs. working on homework. And even when a parent can accurately track the time and tells a child to stop using the device, this often leads to arguments and fights if the child is super-engaged in what they’re doing (I probably see a biased sample of kids who tend to get very irritable when this happens). Plus, I’m sure that Apple can come up with a much more elegant solution than a lockbox with a timer.

Most of us have probably heard by now that defaults matter, whether it’s for organ donation rates, food choices, or 401k participation. And right now the default when setting up a new iPhone or iPad is that the user is all-powerful; she can access all apps, all sites, and use the device at all hours of the day and night. Stricter controls have to be manually enabled, which many parents simply do not do. Also, Apple’s current parental controls (under Settings -> General -> Restrictions) are rudimentary: specific apps can be restricted, and if a child tries to download a new app from the App Store, parents can choose to get an alert on their device that would allow them to approve or deny the purchase. There is also a content filter for apps, movies, and music that restricts adult content, and a website filter that blocks access to adult sites.

This is not a bad start, but far from adequate in today’s environment. I think Apple should ask during the initial setup of a new device whether the device is intended for a minor, and if so, the age of the child. With that info, Apple should then set defaults (which the parent can always change later) that are age-appropriate, in line with expert recommendations on screen time, gaming, and how much sleep kids need. If a device is in “kid mode” and it runs up against preset time limits, it should give the user a warning 5 minutes and 1 minute before the time limit is reached, so there will not be any surprises when the user gets locked out of what they’re doing. The device should also lock itself 1 hour before bedtime. If the child wants to use it past a time restriction, the parent would have to grant permission on a case-by-case basis.

Here’s an example of what I think might be roughly appropriate for 2 broad age groups:

Kids (6-12)
Video watching: 30 mins
Games: 30 mins
Nighttime: No use after 8pm
Apps: All apps (except Phone, Mail, Messages, Music, Photos) initially restricted; parents can manually enable other apps
Contacts: Only allow calls/texts/email with approved contacts
Content: Block adult content and websites

Teens (13-18)
Video watching: 1 hr
Games: 1 hr
Social media apps: 1 hr
Total use of above categories: 2 hrs
Nighttime: No use of most apps after 10pm, but can play music or podcasts
Apps, Contacts, Content: Less restrictive than for kids, but parents should have an easy way of seeing how much time is spent in different apps.
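
To make the proposal concrete, here is a minimal sketch of how those age-based defaults and advance warnings might fit together. This is plain Python rather than anything Apple-specific; every name and structure below is just a hypothetical restatement of the scheme above, not a real Apple API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class KidModeDefaults:
    video_minutes: int             # daily cap on video watching
    game_minutes: int              # daily cap on games
    social_minutes: Optional[int]  # daily cap on social media (None = no separate cap)
    total_minutes: Optional[int]   # combined cap across the above categories
    lockout_hour: int              # 24-hour clock; most apps lock at this hour

def defaults_for_age(age: int) -> Optional[KidModeDefaults]:
    """Out-of-the-box settings by age; parents can always change them later."""
    if age < 13:   # kids (6-12)
        return KidModeDefaults(30, 30, None, None, lockout_hour=20)  # off at 8pm
    if age < 18:   # teens (13-18)
        return KidModeDefaults(60, 60, 60, 120, lockout_hour=22)     # off at 10pm
    return None    # adults: unrestricted by default

def warning_for(minutes_used: int, limit: int) -> Optional[str]:
    """The 5-minute and 1-minute heads-up before a limit is reached."""
    remaining = limit - minutes_used
    if remaining == 5:
        return "5 minutes left"
    if remaining == 1:
        return "1 minute left"
    if remaining <= 0:
        return "Time's up -- ask a parent to approve more time"
    return None
```

The exact numbers matter less than the principle: sensible, age-appropriate values would ship as the default, and the all-powerful configuration would be the one that requires deliberate effort to enable.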

If you think I’m being too strict, then you probably haven’t been paying attention to how much tech industry executives tend to limit their children's access to devices. I said as much on the Twitter.

I was encouraged recently when Apple took a step to look after its users’ interests by requiring that game developers disclose the odds in games that have gambling-style mechanics. Up to now, Apple may have viewed parental control software as a third-party opportunity, creating an opening for successful businesses like OurPact or Disney's Circle. But Apple also has a long tradition of “sherlocking,” in which they steal the best features of a third-party product and incorporate them into their operating system. When it comes to setting better defaults for kids, I would encourage Apple to sherlock away!

Saturday, December 9, 2017

My Free-to-Play Gaming Postmortem

So there was this period of time from mid-September 2015 to mid-October 2017 in which I didn’t write a single blog post. What happened? This post is my attempt to reflect on my hiatus from blogging.

The most convenient answer—and the one most friendly to my ego—is that I had simply gone through some Major Life Changes that got in the way of devoting time to this blog. However, if I dig deeper, I must admit to myself that October 2015 is when I started playing a free Japanese mobile game called Puzzle and Dragons (PAD), and October 2017 was when I started to get tired of playing it; I finally deleted the game from my phone last week.

Left: A random person's monster collection. Right: A monster card in all its glory.
The basic gist of PAD is that you assemble a team based on different “monster cards,” each of which has different properties. You obtain the best cards by spending “magic stones” on a Rare Egg Machine, which pops out a random monster card at a cost of 5 stones. The stones can be earned for free by beating levels in the game or purchased for $0.99 each (or only $59.99 for 85!). With your team, you fight your way through various dungeons, doing damage to the enemies based on how many orbs of the same color you can match in rows or columns of at least 3 on the game’s puzzle board. As with any decently fun game, it felt rewarding to finally beat a difficult level after multiple tries. And the artwork and graphics, hand-drawn by Japanese artists, were top-notch. But in addition to these basic features shared with most games, PAD has many mechanisms that increase its ability to grab ahold of players’ attention, time, and money, and in retrospect these psychological manipulations are easy to see.

Left: Narrowly escaped death from the enemy's attack. Right: My team doing some serious damage to Kali.
The game frequently gives away magic stones and other goodies for free, using reciprocity to make a player feel motivated (or obligated) to keep playing. Also, you get more rewards the more consecutive days you log in, which helps players make the game a daily habit à la Snapstreaks. The most powerful cards, of course, are very rare, so there’s intermittent reinforcement on a variable-ratio schedule when you get lucky and land a good card. There are special events every few weeks called “Godfests,” which are the only times players can get certain rare cards, creating some serious FoMO. Once you’ve invested time and energy to assemble a nice collection, there’s a strong tendency towards loss aversion, as no one wants to feel like they’ve wasted all this time for nothing. Since most of PAD’s players are young men, many of the most desirable cards feature scantily-clad female characters, a.k.a. “waifus.” And there’s a community aspect as well, with multiple forums devoted to the game where players share their accomplishments and good Godfest luck, leading to upward comparisons and social reinforcement.
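
For the technically curious, here is a toy simulation of why the Rare Egg Machine is so hard to put down. The 2% drop rate is a number I invented for illustration (the actual odds were undisclosed at the time); the point is the shape of the schedule, not the exact figure.

```python
import random

RARE_CARD_RATE = 0.02   # invented for illustration; actual odds were undisclosed
STONES_PER_PULL = 5     # each pull of the Rare Egg Machine costs 5 magic stones

def pulls_until_rare(rng: random.Random) -> int:
    """Count pulls until a rare card lands. This is a variable-ratio schedule:
    the reward depends on the number of pulls, not on elapsed time, which is
    the same schedule that makes slot machines so compelling."""
    pulls = 1
    while rng.random() >= RARE_CARD_RATE:
        pulls += 1
    return pulls

rng = random.Random(42)
trials = [pulls_until_rare(rng) for _ in range(10_000)]
avg = sum(trials) / len(trials)
print(f"average pulls per rare card:  {avg:.1f}")                    # ~50 (1/0.02)
print(f"average stones per rare card: {avg * STONES_PER_PULL:.0f}")  # ~250
```

Because each individual pull might be the lucky one, there is never a natural stopping point, and that is precisely the design.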

Despite all that, I’m not sure I would say that I was addicted to the game in a clinical sense. I was spending ~30-60 minutes a day playing the game, and maybe another half hour a day reading about it. My personal relationships and work did not suffer, as far as I can tell. Over the course of 2 years, I spent a grand total of $10 on in-app purchases of magic stones. On PAD forums there are reports of “whales” who've spent thousands of dollars on the game, so I got off relatively easy, at least in a financial sense.

Still, how PAD affected my mind is undeniable. Instead of reading blogs related to psychiatry and mental health, I was reading blogs and watching YouTube channels related to PAD. I stopped even thinking about my blog, and every time I had a spare moment, I would open the PAD app instead of taking in my surroundings or reading a book. In fact, I read far fewer books in 2016 and 2017 than in any other year of my life since I learned to read, though part of that may be due to reading more on the web. It wasn’t all bad, though. I wasted far less time on Twitter, and I was no longer waking up in the middle of the night with ideas for blog posts. I had a convenient and pleasant distraction from politics. And I’ve spent much less mental energy these last 2 years obsessing about my fantasy football teams than I have in previous years.

So what finally made me stop? A part of it was the fact that PAD’s creators are constantly adding more difficult dungeons, which in turn require ever more powerful (and rare) monster cards to deal with. Playing the game started to feel increasingly like a Sisyphean task. I’d also like to think that a part of me missed blogging and reading books. Recently, I came across the philosophically-oriented Slate Star Codex blog, written by a young psychiatrist, and I thought, “If he can write several 2000 to 5000-word blog posts in a week, then why can’t I be even 3% as productive (i.e. roughly a 1000-word post per month)?”

Lastly, some advice for parents out there: as fun as Super Mario Bros was for us in our youth, it does not remotely compare to the reinforcement mechanisms that today’s microtransaction-driven mobile games employ. I’ve heard multiple stories from parents about their kids stealing their credit cards to spend hundreds of dollars on in-app purchases for games like Clash of Clans and Clash Royale. I now believe that parents should not be letting their kids play games like these, which all tend to use similar attention- and money-grabbing tactics. As a general rule, this applies to any of the mobile games that you see advertised on TV; how else would those game companies have so much money to spend on prime time ad spots? Recently, regulators in the Netherlands have started investigating whether games that have “loot boxes” (a similar idea to the Rare Egg Machine) are a form of gambling and should be regulated as such. In my mind there is no doubt that these games can work very similarly to gambling, except that you can’t actually win any money, which in a way makes them worse than gambling.

Even if there are kids who can responsibly play these games without spending too much time or money, I would still strongly suspect that these games have an outsized influence on what their players think about—and stop thinking about—even when they’re not playing. And for me, that was ultimately the biggest negative impact.

Wednesday, November 8, 2017

How Making Consumers Happy Got Us Here

If you’ve never seen Malcolm Gladwell’s 2004 TED Talk: “Choice, happiness and spaghetti sauce,” please take a moment to check it out:


In the talk, Gladwell focuses on the work of “someone who, I think, has done as much to make Americans happy, as perhaps anyone over the last 20 years, a man who is a great personal hero of mine, someone by the name of Howard Moskowitz, who is most famous for reinventing spaghetti sauce.”

Gladwell goes on to describe Moskowitz’s key insight in coming up with chunky pasta sauce for Prego, which is that there is no single sauce that is perfect for everyone, but there is a perfect sauce for each individual consumer. As the saying goes: “The customer is always right.” Thus the explosion: from just Prego vs. Ragu, to different varieties of Prego and Ragu, to the cornucopia of choices we have for pasta sauce today.

But as the sauce went, so went everything else. We no longer have to suffer through the primitive days of ABC’s Wide World of Sports or just one cable sports channel. There’s ESPN 2 (and 3 and Classic), FS1, NBCSN, CBSSN, even channels devoted to motorsports or golf. Long gone are the days of everyone tuning in to Walter Cronkite for the day's news. Instead, everyone can find the talking head who agrees most with their personal views and never have to be inconvenienced by a dissenting view.

Online, we no longer have to be exposed to the same reality or set of facts. Facebook, YouTube, Google News, et al. make it so we don’t even have to go out of our way to search out those with similar views; these behemoths feed us stuff based on all the data they have gathered from tracking our online behavior. Of course, they’re doing this to make us happy, but what price are we paying for this sort of happiness?

Which brings us, inevitably, to the present political situation. I think it’s obvious that our current president would not be in office if not for this drive to feed consumers only what they want to see, hear, and experience. Much has been made about how the Russians took advantage of people’s news feeds to try to drive Americans further apart. However, even without foreign interference, I believe that modern America’s brand of consumerism is damaging to all Americans, young and old, left and right. It tells the consumer, “only you and what you want matters.” This kind of implicit message inevitably leads to inflated egos all around, self-selection into smaller and smaller interest groups, and less of a willingness to see things from another perspective. Not surprisingly, frustration, anger, and inability to compromise are the result when people who are used to shaping their own reality are confronted by realities determined by others with different beliefs, such as when a black president gets elected or a controversial speaker gets invited to speak at a college campus.

Is fixing our system (which I believe involves fixing our culture) even possible at this point? Once the Pandora’s Box of unlimited choice for the consumer has been opened, is there any going back to the spirit of “ask not what your country can do for you—ask what you can do for your country”? In my less hopeful moments, I think that it would take some sort of unimaginable catastrophe—like the Great Depression bringing an end to the Roaring ’20s—to drive people to put sufficient effort into overcoming the centrifugal forces that are splitting us apart. Yet, as Tom Hanks said, “If you’re concerned about what’s going on today, read history and figure out what to do because it’s all right there.” I'm not sure if he was thinking about a specific historical era, but what came to my mind was what happened during the Renaissance Papacy, when the Popes became so focused on worldly riches, pleasure, and power that they lost their religious legitimacy, leading directly to the Protestant Reformation.

From the Wikipedia entry:
The popes of this period used the papal military not only to enrich themselves and their families, but also to enforce and expand upon the longstanding territorial and property claims of the papacy as an institution. […] With ambitious expenditures on war and construction projects, popes turned to new sources of revenue from the sale of indulgences and of bureaucratic and ecclesiastical offices. […] The popes of this period became absolute monarchs, but unlike their European peers, they were not hereditary, so they could only promote their family interests through nepotism.
That period of the papacy lasted roughly a century before the Reformation forced Catholicism to reform itself. Yes, there were bloody religious wars as a result of the split in Western Christianity, and peace between Catholics and Protestants took centuries to achieve in some places. And some pundits argue that the Reformation created as many horrors as it addressed. But the overall (admittedly simple) lesson I get from this history is that there are many potential Martin Luthers out there, waiting to change the world, even if inadvertently. I just hope we don’t have to wait a hundred years for that to happen.

Sunday, October 15, 2017

About That Jean Twenge Smartphone Article

Note: As you may notice, this is my first blog post in over 2 years. This blog isn’t dead, it was just resting! I’m thinking about writing a post about why I haven’t blogged in so long; maybe it'll even be done less than 2 years from now.

For the past 3 months, my most frequently-visited blog post has been my critique of Jean Twenge’s claims of a “narcissism epidemic” from 2013. Google tells me that it’s one of the top results in searches for “jean twenge criticism.” So in this post, I would like to share my views on her latest work.

I read her article for The Atlantic, “Have Smartphones Destroyed a Generation?” (an excerpt from her new book iGen), the day that it came out, because as a psychiatrist who works with adolescents, it’s clear to me that smartphones have been changing their lives in ways profound and subtle. I actually like many aspects of the article, including her sympathetic portrayal of the complexity that smartphones have brought into teenagers’ already complex lives. Also, I think she presents the data well, and the data sources that she uses are nationally representative surveys that have been around a long time and are well-respected. I think she makes a rather convincing case that many teens today are living their social lives online rather than hanging out with their friends in person.

However, I do have some criticisms of Twenge’s far-reaching claims about the effects of smartphones. But first, that ridiculous title:

Thankfully, it seems the author agrees:

One criticism that others have voiced is that Twenge seems to draw conclusions based primarily on the correlation between the rise in smartphone use and increases in mental health issues in teens over the same span. While she acknowledges that many of the trends she highlights, such as adolescents taking longer to take on adult responsibilities, predate the introduction of smartphones, she sees the rising use of smartphones as some sort of inflection point. But with so many other trends going on in our culture, including parents becoming more over-protective and rising political/racial/economic divides, pinning the blame on smartphones seems overly facile. Heck, if I were being cheeky, I would point out that the sale of yoga pants has drastically increased since 2011, corresponding to increases in teen depression:

Source: Business Insider

But just because they correlate does not mean that one has anything to do with the other.
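
A quick sketch shows how cheap this kind of "evidence" is to manufacture: correlate any two steadily rising series and you get a near-perfect r. All the numbers below are invented for illustration.

```python
from statistics import correlation  # Python 3.10+

# Two made-up, steadily rising series for 2011-2017.
yoga_pant_sales = [10, 14, 19, 23, 30, 36, 41]
teen_depression_rate = [8.0, 8.4, 9.1, 10.7, 11.9, 12.5, 13.2]

# Any two upward trends correlate strongly; no causal link is required.
print(f"r = {correlation(yoga_pant_sales, teen_depression_rate):.2f}")  # ~0.99
```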

Thus, my biggest criticism stems from Twenge’s seeming certainty about the decisive role of smartphones, coupled with an apparent lack of curiosity about the deeper causes of these trends. In a recent NY Times Magazine article about increasing rates of anxiety in teens, Twenge had this to say:
“The use of social media and smartphones look culpable for the increase in teen mental-health issues,” [Twenge] told me. “It’s enough for an arrest — and as we get more data, it might be enough for a conviction.”
I’m sorry, but I think the situation is closer to her finding evidence of a crime, and possibly even a weapon, but she is nowhere near identifying—much less convicting—a suspect.

I find it vexing that Twenge seems to view each generation as a distinct and determinative entity, rather than some arbitrary line drawn by demographers, as she shows in this tweet:
That is just preposterous. She writes as though each generation somehow pops into existence with its own innate characteristics, rather than being influenced by—and reacting to—the generations that have come before. However, one of the few things I know for sure is the huge extent to which young people are influenced by their elders. For example, this recent article (also in The Atlantic, lol) highlighted just how much even 1-year-old infants learn from observing the actions of the adults around them. And what are kids observing these days?

In my own practice, I often hear from kids and teens who say that their parents are on their laptops checking work email or on their phones checking Facebook all the time. These kids are bored and lonely, so as soon as they have access to a smart device, what do they do? Twenge’s work makes it easy for parents to blame the devices and not think about how their own actions may be influencing their children. While that may protect parents’ egos and sell more books, it’s a very incomplete and misleading picture, to say the least.

I have not yet read Twenge’s new book, but I was hoping that it would take a deeper look at the culture as a whole, especially the critical role that parents can play in changing the situation. However, one look at the book’s table of contents reveals that only the last 26 pages are devoted to a chapter on “Understanding—and Saving—iGen”. This scathing review from NY Mag further breaks down the book and the motivations of the author. The reviewer takes the view that Twenge is less a scholar who investigates all aspects of a complex issue than she is someone positioning herself as a guru for marketers looking to understand the latest generation of teens.

In conclusion, while there certainly is a mental health crisis going on in today’s teens (and adults!) and pinning the blame on smartphones is understandable, I believe it will take far more than getting rid of everyone’s favorite devices to make our culture healthier for future generations.

Saturday, September 5, 2015

Who Controls the Future of Medical Knowledge? Part I

The recent discontent amongst physicians regarding the process of maintaining board certification in various specialties got me thinking about a broader question: how do doctors acquire new medical knowledge, especially after medical school? Which brings me to an even more critical question: who controls said knowledge?

I would argue that, next to our ability to listen to and empathize with patients, the most valuable asset of the medical profession is our knowledge. Ever since the days of Hippocrates, medical knowledge has been transmitted from one doctor to another in essentially the same way. In medical school and residency, we attend lectures, read textbooks, study cases, answer Socratic questions posed by more experienced clinicians, and most importantly, learn by seeing numerous patients and accumulating experience. After graduating from medical school, it seems that most doctors learn by conferring with one another, reading journals, and attending conferences.

But the more information there is, the more time it takes to access and acquire new knowledge, and the harder it becomes for individual physicians to keep up.

You can be sure that corporations are well aware of this. On the patient side, of course, Dr. Google already provides incredible ease of access to knowledge and profits handsomely from selling ads to consumers. Pharmaceutical companies know more about my prescribing practices than I do, which fuels their targeted marketing efforts. More ambitiously, IBM's Watson Health Cloud promises to "bring together clinical, research and social data from a diverse range of health sources, creating a secure, cloud-based data sharing hub, powered by the most advanced cognitive and analytic technologies." And as much as I panned athenahealth's advertising in an earlier post, the electronic medical record companies will certainly find clever ways of profiting from the vast troves of health care data that they accumulate. And doctors are paying for the privilege of providing that information to them!

At least SERMO ("the most trusted and preferred social network for doctors") pays doctors for completing surveys, but you can be sure that they're in the same game. They keep their service free by monetizing the attention and knowledge of doctors: "Organizations seeking physician expertise, such as pharmaceutical companies, medical device firms, and biotechs, underwrite the market research and sponsorship opportunities within our site."

So what options are available for doctors who want to share their knowledge with each other free from the confines of a data mining operation? Of course, we can still consult with colleagues the old-fashioned way, either in person or by phone. But after having these conversations, the knowledge still resides in the brains of people, not easily accessible to future doctors who may run into similar situations. Our professional associations post practice guidelines that hardly anyone reads, and at annual meetings, there are opportunities to meet with expert clinicians to discuss cases, which seems terribly inefficient. What about higher-tech options? There are numerous subscription services that provide summaries of research studies, but I believe that the patients doctors see do not necessarily resemble those who sign up for clinical trials. There are electronic mailing lists in which doctors can discuss cases, and which allow members to search through previous conversations. And there's wikidoc, a free Wikipedia for doctors. However, these options are used by very few doctors and are paltry efforts next to the commercial ambitions of Big Data.

With all these business interests aiming to aggregate and profit from the knowledge of doctors, is there anything that the medical profession can do to avoid having our knowledge become some company's proprietary intellectual property?

I don't claim to have the answers, but I will explore some ideas in Part II. Stay tuned…

Monday, April 27, 2015

The Most Popular Psychiatrists in America (According to Twitter)

All the recent hubbub over Dr. Mehmet Oz got me thinking more about fame when it comes to medical doctors: how they gained their popularity, to what end they employ their platforms, and how they keep (or don't keep) their professional integrity. One of the easiest ways to estimate popularity is to look at how many people follow an individual on Twitter. There, Dr. Oz is clearly way ahead of the practicing physician pack with 3.75 million followers. Dr. Drew Pinsky is second at 3.16M, while CNN's Dr. Sanjay Gupta is a distant third with 1.98M. In comparison, well-known blogger Dr. Kevin Pho "only" has 122K followers.

Curious about who the most popular psychiatrists are, I searched Twitter for individuals (not organizations) with profiles matching "psychiatrist" on 4/26/15. I examined the first 100 or so profiles written in English, looking at the follower count and selecting the 4 psychiatrists with the most followers for further scrutiny (and speculation), focusing on the nature of their popularity and just how much B.S. they espouse. Here's what I found:

#4: Judith Orloff (40.0K followers | following 10.3K)

Claim to fame: According to her Twitter profile, Dr. Orloff is a "psychiatrist, intuitive healer, and author of THE ECSTASY OF SURRENDER about how to let go of stress, trust intuition, and embrace joy." She has also written other books with titles such as [her CAPS]: EMOTIONAL FREEDOM, POSITIVE ENERGY, INTUITIVE HEALING, and SECOND SIGHT. I have never heard of her or any of her books; judging by their descriptions, they are very much targeted toward a non-scientifically-minded audience (which is to say, just about everyone).
B.S. meter: 7 poo. Dr. Orloff's about page emphasizes the power of intuition to help us "heal—and prevent—illness" and is full of quotations describing her as "a prominent energy-based healer" and a "positive energy guru." I have no doubt that she is a great psychiatrist who helps her patients and readers feel better, and I happen to agree with the message in her latest book about the importance of letting go as opposed to "pushing, forcing, and over controlling people and situations." Yet my intuition tells me that anyone who promotes herself with a sentence like "Dr. Orloff is accomplishing for psychiatry what physicians like Dean Ornish and Mehmet Oz have done for mainstream medicine" needs to be approached with a healthy dose of skepticism.

#3: Daniel Amen (78.7K followers | following 29.3K)

Claim to fame: Frankly, I was surprised that he was not #1. He's the only psychiatrist that I immediately recognized out of the 4 I found doing this search and the only one with a verified Twitter account, which Twitter only bestows upon "key individuals and brands." Dr. Amen is the founder of Amen Clinics, which uses SPECT brain scans to purportedly diagnose mental disorders. He has been featured in programs running on PBS, and he even has influence amongst Christian audiences. In 2012, a Washington Post article called him "the most popular psychiatrist in America."
B.S. meter: 8 poo. There have been numerous well-articulated criticisms of Dr. Amen and his ridiculous claims regarding SPECT scans that I won't rehash here, save for one especially galling fact: his clinic charges $3500 for an initial evaluation and SPECT scan, which is generally not covered by insurance. While the clinic's website does not reveal this cost up front, it does say they've done over 100,000 scans, so you do the math. PBS's own ombudsman has disavowed any association with Dr. Amen's infomercials that were aired by local PBS affiliates without adequate disclaimers. Dr. Jeffrey Lieberman, former president of the American Psychiatric Association, was quoted in the Washington Post article as saying this about Amen: "In my opinion, what he’s doing is the modern equivalent of phrenology." On that point, Dr. Lieberman and I can agree.

#2: Brian Weiss (80.7K followers | following 25)

Claim to fame: Dr. Weiss's website tells us he "was astonished and skeptical when one of his patients began recalling past-life traumas that seemed to hold the key to her recurring nightmares and anxiety attacks. His skepticism was eroded, however, when she began to channel messages from 'the space between lives,' which contained remarkable revelations about Dr. Weiss's family and his dead son. Using past-life therapy, he was able to cure the patient and embark on a new, more meaningful phase of his own career." He is the author of books such as Miracles Happen: The Transformational Healing Power of Past Life Memories, and Many Lives, Many Masters: The True Story of a Prominent Psychiatrist, His Young Patient, and the Past-Life Therapy That Changed Both Their Lives. Not surprisingly, his homepage prominently features a photo of him and Oprah. He runs 5-day workshops costing $1000/person for "anyone interested in exploring these profound psychospiritual techniques."
B.S. meter: 10+ poo. Someone in a past life once told me, "If you ain't got nothin' nice to say, then it's better to say nothin' at all." I will stick with that for my current life and any of my future lives…

Thus far, the trend seems to be greater popularity correlating with ever escalating levels of B.S. I was losing what little faith I had entering this exercise. So I was shocked by who ranked first:

Dr. Tobias Fünke

For a moment, I thought I was looking at Dr. Tobias Fünke from Arrested Development. But no, it's actually this guy:

Dr. Norman Rosenthal

#1: Norman Rosenthal (101K followers | following 28.3K)

Claim to fame: I had never heard of Dr. Rosenthal before, but he is the only psychiatrist I can find with over 100K followers. According to his website, he "has written over 200 scholarly articles, and authored or co-authored eight popular books. These include Winter Blues, the New York Times bestseller Transcendence, and the Los Angeles Times bestseller The Gift of Adversity. Rosenthal has conducted numerous clinical trials of medications and alternative treatments, such as Transcendental Meditation for psychiatric disorders, and the treatment of depression with Botox." Watching him on YouTube, I felt that his South African accent instantly gave him added authority and gravitas (I call this the Salvador Minuchin effect).
B.S. meter: 1 poo. I was ready to be skeptical of Dr. Rosenthal, and this promotional page for his newest book is chock-full of celebrity endorsements, including one from Dr. Oz himself. But the book actually seems to offer very sensible advice (based on Dr. Rosenthal's own life) on how to cope with adversity, and reading a passage from it on Google Books, I even learned some interesting things about how the NIMH worked during the transition to the Steve Hyman/Tom Insel era. Dr. Rosenthal's research publications also left me impressed. He worked at the NIMH for 2 decades, and he did impactful studies on seasonal affective disorder, sleep disturbance in mania, and the use of light therapy for delayed sleep phase syndrome. He still sees patients in his clinical practice, where he seems to emphasize integrating different treatment modalities instead of pretending there's some magic bullet. And this is my own personal bias, but I find it touching that his son Joshua has followed in his footsteps, becoming a child and adolescent psychiatrist.

So what did I learn about psychiatrists and fame, at least when it comes to Twitter? Obviously, it helps to write multiple best-selling books and to regularly appear on television. Presenting oneself as an "alternative" practitioner with special knowledge or healing techniques helps as well. I won't delve into the content of their tweets in this post, but it seems relentlessly positive messages and pithy tips on how to improve one's life are a must in order to reach as broad an audience as possible.

Also, 3 of the 4 psychiatrists employ the method of following tens of thousands of people in hopes of getting as many people as possible to follow them back. In contrast, the truly famous doctors tend to have much more sane follow counts: Dr. Oz follows 85, Dr. Pinsky follows 422, and Dr. Gupta follows 198. Thus, Dr. Weiss may well have the most impressive follower count amongst psychiatrists, given that he only follows 25 people for a follower:following ratio of 3228!
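
A quick back-of-the-envelope with the counts quoted above makes the contrast plain:

```python
# Follower and following counts as quoted in this post (April 2015).
accounts = {
    "Rosenthal": (101_000, 28_300),
    "Weiss":     (80_700, 25),
    "Amen":      (78_700, 29_300),
    "Orloff":    (40_000, 10_300),
}

for name, (followers, following) in accounts.items():
    print(f"{name:9s} follower:following ratio = {followers / following:,.1f}")
# Weiss: 3,228.0 -- the other three all land between 2.7 and 3.9.
```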

Before doing this search, I did not follow any of these top 4 psychiatrists on Twitter. Of the accounts that I follow, 8 of them follow Dr. Amen, 4 follow Dr. Orloff, 4 follow Dr. Rosenthal, and only 1 follows Dr. Weiss (really, @AACAP?). While writing this post, I've decided to follow Norman Rosenthal. He's the one out of the 4 who seems to have most preserved his professional integrity without wading deeply into the realm of pseudoscience, pop spirituality, or utter nonsense. I think every psychiatrist (or doctor, for that matter) aspiring to semi-celebrity status can learn something from him ;-)

Sunday, July 6, 2014

The Limits of Big Data in Psychiatry

While browsing The Atlantic earlier this week, I came across this:


Yes, I was tempted to click on the article involving electric shocks, but it was the ad "Rising Mental Health Issues Facing Our Children, in Five Charts" that caught my attention. The colorful charts show some alarming-looking numbers that most readers of this blog are probably used to seeing by now: that there is a large increase in children receiving mental health diagnoses, that ADHD is diagnosed at very high rates (especially in the South), that children on Medicaid are more likely to receive a mental health diagnosis than children with commercial insurance, etc. The data for the charts were gathered from pediatrician visits across the country by athenahealth (apparently they're too cool for capital letters), the electronic health record (EHR) company that paid for the ad.

They helpfully included a video at the bottom of the ad so we can get "perspectives on these trends from top health care leaders." Take a look:



Watching this video, I was struck by the words of Kurt Newman, M.D., President and CEO of Children's National Medical Center:
"These graphs are just probably the tip of the iceberg. The directional trend is very disturbing, but also the magnitude is disturbing, and these pediatricians are swamped.
[…]
That's why we need to do more research, we need to have a better system in terms of more providers, we need to be able to pay the providers a reasonable amount for the care they're giving. But I think if we do all that, we're going to have a huge impact for these kids and families."
Classic. There's an epidemic on, doctors are swamped—we need more funding so we can provide more treatment! No wonder he's the CEO. And like many other CEOs, he oversells when talking about the future:
"We're on the cusp of something really huge there. It's kind of like big data and big analytics that are gonna really revolutionize how we can identify these trends or get specific about certain diseases […] Autism might be a hundred different rare diseases that are all rolled up into one. We won't figure that out unless we have the analytics, all of the the really sophisticated capability of probing into: is that patient like that patient, is that child like that child, what made them more similar?"
Perhaps I'm too dumb to comprehend big data/analytics, but I fail to see how information mined from an EHR is going to shed light on the etiology of autism. Also featured in the video is Angela Diaz, M.D., Director of the Mt. Sinai Adolescent Health Center, who seems to have a more common sense take on the data:
"We need to figure out what is leading to these kids…30% of U.S. students to feel sad and hopeless for the last 12 months, and of those, 40% of the girls? What is going on? So we need to get to the root causes of these things, and try to identify and then figure out, how to prevent?"
I certainly agree with Dr. Diaz on the importance of trying to determine the root causes of the rising rates of these conditions. However, having the raw data and figuring out causality are two very different things. I would argue that in psychiatry we already have access to tons of data, but unfortunately much of it is interpreted through a very narrow, biologically-oriented lens. Having faster access to bigger pools of data is not going to help. Case in point: the January 2014 JAACAP article that described rising rates of ADHD in the US, which I had previously blogged about. That article was accompanied by an editorial by Drs. Walkup, Stossel, and Rendleman that essentially heralded the findings as good news and a sign that ADHD is being increasingly recognized and treated, which is desirable from a "public health" point of view.

In the June 2014 issue of JAACAP, Dr. Jonathan Posner wrote a very reasonable letter to the editor (subscription required), pointing out that other, more rigorous studies (relying on both parent and teacher report instead of parent report only) have found rates of ADHD closer to 5-7% instead of the 11% reported in the JAACAP article. He concludes that the reason for the rapidly increasing rates of ADHD diagnosis in the community may be "that a substandard approach to diagnosing ADHD has become the norm." Drs. Walkup and Rendleman wrote a reply (subscription required); here's the first paragraph of their response:
Thank you very much for your comments. Your position is one that we believe is shared by many, which is why we wrote the piece. Although we respect your and others’ opinions, we find it difficult to support the statement that rising rates are due largely to substandard assessment of ADHD—it is just too simplistic an explanation. The solution that you allude to is likely not tenable for a high-prevalence condition such as ADHD, because there just aren’t enough child psychiatrist providers to do it all. We are not advocating poor-quality diagnosis or inappropriate treatment; rather, the goal of the editorial was to understand the role of advocacy and education in rising rates, the importance of a public health approach to high-prevalence conditions, and to help child and adolescent psychiatrists come to terms with the fact that our traditional model of care, which is time intensive and highly personalized, is not likely to be able to address the public health burden of ADHD. We certainly do not want to inhibit the pediatric prescriber from taking on the challenge. They need our support to do it well.
So the assumption they make is that cases of ADHD reflect a biological disorder and that increasing awareness of the condition amongst the population, diagnosing it, and treating it with medications is good and proper.

Imagine, for a moment, something like this happening with the obesity epidemic. The maps of child obesity in the U.S. look suspiciously like those of the ADHD epidemic, with the highest rates in the deep South. Sure, there are drugs to treat obesity, but would anyone talk with a straight face about a "public health" approach to obesity consisting of identifying the cases and then treating them with medications? Wouldn't a better (and true) public health approach be to ensure that children get adequate exercise and good nutrition, and that people aren't incentivized to buy the cheap calories and processed "foods" that are making them obese?

Thus, as long as the prevailing view of psychiatric conditions is a narrow one, the data will be used for narrow purposes, such as academic leaders/CEOs arguing for more resources, or to justify the high rates of psychiatric medication prescribing. Here, I'm not even going to get into some other questions I had about the athenahealth ad, including who its intended audience is and what it is trying to achieve. Most doctors recognize that EHRs do not help them care for patients. These systems mostly appeal to large clinics and hospital organizations, for reasons that I will let Dr. George Dawson's recent blog post explain.

Sunday, June 15, 2014

Psychiatry's Low-Tech Advantage

The other day, I received this in the mail:


It's a 57-page booklet/brochure ("bookchure"?) filled with professional photos designed to tug at the heartstrings, minimalist typography, and colorful charts highlighting the awesomeness of Akron Children's Hospital. All I could think of was, "How much money did they waste on this?" Living nowhere near Ohio, I will never have the chance to refer a patient to them. Pages 51-53 list 6 names on their Board of Directors, 26 Directors, 3 Directors Emeritus, and 5 Honorary Directors. This many Directors, I presume, are needed to oversee the 4751 employees and 703 medical staff (p. 50), as well as $1.06 billion in gross patient services revenue (p. 56).

And this wasn't the first such bookchure I've received. I've gotten similar mailings from the Mayo Clinic, the Cleveland Clinic, and probably other places that I've since forgotten. This is what our health care industry has become: specialty centers that vie for clientele by boasting about the high-tech procedures and treatments they offer. It reflects a system where about 20% of the population takes up 80% of the costs (and even more damning, 5% of people account for 49% of spending).

At its core, psychiatry is a very low-tech specialty, perhaps the one least reliant upon machines and specialized equipment. That's not to say there's no technology in the field, since knowledge constructs such as CBT are also forms of technology (and let's not forget Big Pharma), but psychiatry today is generally not what anyone would call "high-tech."

The leaders of academic psychiatry and the director of NIMH certainly view the low-tech nature of psychiatry as a huge disadvantage, a travesty that they are doing everything in their power to try to rectify. Hence the ever-greater emphasis on higher-tech ways of studying and manipulating the brain, whether it's optogenetics or connectomes.

However, I view psychiatry's low-tech nature as a huge advantage, at least when that advantage is embraced. A psychiatrist can easily start a practice due to low capital costs and enjoy low overhead since there is no need for a huge support staff. This keeps the focus on the relationship between the doctor and the patient, rather than having some other intermediary like an insurance company or a managed care organization extracting profit. Patients get to spend more time with their psychiatrist, and the psychiatrist has to see fewer patients, resulting in a win-win scenario. Especially if you believe, as I do, that a good therapeutic relationship can lead to positive changes.

Rather than embrace these advantages, the leaders of our profession have done all they can to minimize them, by advancing and supporting a biomedical model of psychiatry where psychiatrists are turned into prescribers doing brief med checks (or into consultants to other doctors). Since drugs are one of the few high-tech (and expensive) things in psychiatry, this of course serves the interests of pharmaceutical companies and the researchers that they support.

Last week, 1 Boring Old Man wrote about new APA President Paul Summergrad's plea for psychiatrists "to put aside internecine battles":
What [Summergrad's] predecessors have failed to notice is that a growing number of psychiatrists refuse to operate in the world created for them by Managed Care and insurance reimbursement, and that’s not all about money. […] A lot of it has to do with being unwilling to have practice dictated by excel spreadsheets in the offices of bureaucrats, the marketing departments of a corrupt industry, or the moguls of the APA and NIMH. Many avoid the APA like a plague. And many who still work in that system would be glad for a chance to change it into something more compatible with the real reasons they chose this specialty in the first place.
I really like the above paragraph from 1BOM since it captures the essence of the problems within our profession, but I would say that it's very hard to be a part of "that system" without being subject to the general economic trends affecting all of healthcare. Most other specialties are not quite as low-tech as psychiatry, but the ones that rely on talking to patients and examining them with very basic equipment, such as internal medicine and pediatrics, certainly have similar dynamics.

With all that said, I am by no means anti-technology, as long as the technology is serving the patient. For example, a recent San Francisco Chronicle article highlights one entrepreneur's efforts to create "a website for a health care model in which members pay monthly fees for primary care." If that works, it would help remove primary care physicians from the grind of being in the current insurance reimbursement-based system, which has led to high rates of burnout. Also promising are the health initiatives of companies like Apple, which have the potential to empower individuals to keep better track of their own health (and allow doctors easier access to that information), which hopefully will someday decrease society's reliance on the high-tech specialty hospitals with their fancy publicity materials.

Sunday, September 22, 2013

Louis C.K., Mindfulness Guru?

Note: The last couple of months have been very busy for me, so I apologize for the infrequency of posts. Now that things have gotten back to normal, I hope to resume posting weekly.

Louis C.K.'s recent appearance on Conan has already been linked to on multiple sites, with most of the headlines reading something like "Louis C.K. on why kids shouldn't have smartphones." Check out the video below if you haven't yet seen it:



C.K. is one of my favorite comedians, and this clip shows why. Like many comedians, he often says things that people are thinking but are too afraid to say themselves. Here, he puts a voice to many things that I, as a child psychiatrist, would love to say to parents but have a hard time finding a diplomatic way to express.

To me, what he said is not about "hating cellphones" or "kids shouldn't have cell phones." His riff is much broader than that. He starts out talking about parenting, and how parents give in to their kids and get them phones because "all the other kids have the terrible things." Of course, this dynamic existed long before cell phones became common, and gets to the heart of how much trouble parents have in setting appropriate limits because they are afraid of momentarily making their child sad or mad. However, if a parent doesn't teach his or her child how to handle being disappointed or told "no," then who will? Why not "let your kid go and be a better example to the other [bleeping] kids," as Louis C.K. says?

He then talks about how face-to-face interactions can help build empathy, but when a child engages in cyber-bullying, he or she does not get the feedback of seeing the other child's expression turn to sadness, and instead "when they write 'you're fat', then they just go mmm..that was fun, I like that."

Next, C.K. gets to the heart of what mindfulness is about to me. "You need to build an ability to just be yourself and not be doing something. That’s what the phones are taking away. The ability to just sit there, like this. That’s being a person." I would add that of course, the ability to just sit and tolerate being yourself was already difficult before smartphones became ubiquitous, with a 2006 Kaiser Family Foundation report showing that American youth spent almost 4 hours a day watching TV/videos, over 1.5 hours listening to music, about 1 hour on a computer, and almost another hour playing video games, with many of these activities happening simultaneously. Let's not forget all the other mindless ways of distraction other than smartphones.

C.K. even ventured into existentialism, how "underneath everything in your life, there's that thing, that forever empty…that knowledge that it's all for nothing, and that you're alone." He dares to utter the truth, long known to Buddhists, that "life is tremendously sad, just by being in it." He adds, "That's why we text and drive, pretty much 100% of people who are driving are texting…people are willing to risk taking a life and ruining their own cause they don't want to be alone for a second."

Lastly, Louis shared a story about how he was driving one day, and a Bruce Springsteen song came on that made him feel really sad. Instead of avoiding his sad feelings by texting people, "I pulled over, and I just cried…so much, and it was beautiful…sadness is poetic, you're lucky to live sad moments…I was grateful to feel sad, and then I met it with true, profound happiness." His overall message is one that I try to tell patients all the time. They often tell me that they don't let themselves feel sadness or grief, because they're afraid of feeling overwhelmed. However, attempts to suppress those sad feelings just get in the way of a person truly being content with life. As C.K. said, "Because we don't want the first bit of sad, we push it away...and you never feel completely sad or completely happy, you just feel kinda satisfied with your products, and then you die."

Despite the jokiness of the delivery, Louis C.K.'s message is quite serious and well thought-out. I hope everyone listens.

Monday, June 24, 2013

What to Do if Your Kids Are Obsessed with Technology

I clicked on author Steve Almond's piece in yesterday's New York Times Magazine fully expecting to roll my eyes at yet another alarmist screed about how electronic devices are destroying childhood. However, after reading (and re-reading) it, I came away mostly impressed. I think he made many salient points about the challenges of parenting in the touch-screen era, which I would like to explore some more.

Look in the Mirror

One of the most important influences on how children interact with technology is the example set by their parents. Many parents take the approach of "do as I say, not as I do," which almost never works. Here, Almond does a good job of self-examination:
[...] But even without a TV or smartphones, our household can feel dominated by computers, especially because I and my wife (also a writer) work at home. We stare into our screens for hours at a stretch, working and just as often distracting ourselves from work.

Our children not only pick up on this fraught dynamic; they re-enact it.
He also recognizes when he is using technology as an easy pacifier:
After all, we park the kiddos in front of SpongeBob because it’s convenient for us, not good for them. (“Quiet time,” we call it. Let’s please not dwell on how sad and perverse this phrase is.) We make this bargain every day, even though our kids are often restless and irritable afterward.
That he views this strategy as one of his "failings as a parent" is a bit harsh. Almost all parents do this at least some of the time. Unfortunately, what he does not discuss in detail is just what his relationship is like with his children. That is the critical piece. If he is having meaningful conversations or one-on-one play time with his children, or if he is helping to get them involved in a variety of activities, then he is probably not failing as a parent.

Set Limits, Maintain Balance

The American Academy of Pediatrics recommends the following: "Children and teens should engage with entertainment media for no more than one or two hours per day, and that should be high-quality content. It is important for kids to spend time on outdoor play, reading, hobbies, and using their imaginations in free play." The AAP also recommends that children under age 2 not be exposed at all to television and other entertainment media. It's best to start implementing rules around technology use early on; waiting until a child becomes a teenager is way too late. Almond tries to set some appropriate limits for his children:
[...] We ostensibly limit Josie (age 6) and Judah (age 4) to 45 minutes of screen time per day. But they find ways to get more: hunkering down with the videos Josie takes on her camera, sweet-talking the grandparents and so on. The temptations have only multiplied as they move out into a world saturated by technology.

Consider an incident that has come to be known in my household as the Leapster Imbroglio. For those unfamiliar with the Leapster, it is a “learning game system” aimed at 4-to-9-year-olds. Josie has wanted one for more than a year. “My two best friends have a Leapster and I don’t,” she sobbed to her mother recently. “I feel like a loser!”
He is certainly right about just how much various devices have become a seemingly vital part of children's lives; it is unrealistic to think that any child can be immune from their allure. In my mind, an important task for parents is to help their children learn how to use technology without being consumed by it. Setting appropriate limits and having a plethora of other activities for the child to engage in helps this learning process. From this anecdote, it sounds like his daughter did not end up getting a Leapster, despite her heart-wrenching words. Perhaps she was able to learn a small lesson here: that her life will go on even if she does not have the same shiny thing as everyone else.

Be Aware of Family-of-Origin Issues

When it comes to raising children, it's always interesting to see how some parents recreate with their own children a dynamic similar to the one they had with their parents. Others go to the opposite extreme: if their own parents were too harsh, they might be overly permissive with their children. Thus, one of the most interesting paragraphs hints at the author's relationship with his own parents:
My brothers and I were so devoted to television as kids that we created an entire lexicon around it. The brother who turned on the TV, and thus controlled the channel being watched, was said to “emanate.” I didn’t even know what “emanate” meant. It just sounded like the right verb.

This was back in the ’70s. We were latchkey kids living on the brink of a brave new world. In a few short years, we’d hurtled from the miraculous calculator (turn it over to spell out “boobs”!) to arcades filled with strobing amusements. I was one of those guys who spent every spare quarter mastering Asteroids and Defender, who found in video games a reliable short-term cure for the loneliness and competitive anxiety that plagued me. [...]
Later, when Almond writes about seeing his children drawn to electronic games and cartoons, he observes: "I’m really seeing myself as a kid — anxious, needy for love but willing to settle for electronic distraction to soothe my nerves or hold tedium at bay." I can't help but wonder how his parents' approach to child-rearing might have influenced his anxiety and loneliness. I did find it curious that he wrote that his daughter's "job is to make the same sometimes-impulsive decisions I made as a kid (and teenager and young adult). And my job is to let her learn her own lessons rather than imposing mine on her." His actions, however, seem to indicate otherwise: he is much more active than his own parents were in setting appropriate limits around his children's technology use. There is nothing wrong with parents imparting lessons learned in their 20s to their own children, if those lessons are about not letting technology rule one's life.

Understanding the Purpose of Technology

Of course, not all uses of technology are equal. A child could be using an iPad to learn how to read, draw, or even program. Alternatively, a child could be playing mindless games nonstop. The distinction is crucial, so parents need to know how their children are spending their time on these devices. While Almond acknowledges that iPads may be good educational tools when used effectively by good educators, he raises the following concerns:
The reason people turn to screens hasn’t changed much over the years. They remain mirrors that reflect a species in retreat from the burdens of modern consciousness, from boredom and isolation and helplessness.

It’s natural for children to seek out a powerful tool to banish these feelings. But the only reliable antidote to such burdens, based on my own experience, is not immersion in brighter and mightier screens but the capacity to slow our minds and pay sustained attention to the world around us. This is how all of us — whether artists or scientists or kindergartners — find beauty and meaning in the unceasing rush of experience.
If a person mainly uses a screen device to banish unpleasant feelings, then that is indeed very unfortunate. I do agree with Almond's emphasis on the importance of children learning about the real physical world that surrounds them. I would add that it's important that they learn about their own inner world of thoughts and feelings as well, so that when they inevitably experience anxiety or sadness or boredom, they do not automatically seek to banish it with a screen of some sort.

I once tweeted:

[embedded tweet]
If I could have a do-over, instead of "do nothing" I would say: "I wonder if all these children raised on touch-screens can keep themselves occupied without one?" Almond ends the essay with his daughter sitting for five minutes, waiting for a cardinal to visit the family's compost bin, and with his hope that she does not forget the wonders of the real world. I think there's reason to be optimistic, despite the very pessimistic title of the article: "My Kids Are Obsessed With Technology, and It’s All My Fault." I'd like to say to Mr. Almond: it's not your fault. Most kids are obsessed with technology. It would only be your fault if they were obsessed and you let them spend all their time in front of a screen.

Saturday, May 11, 2013

What If the NIMH Succeeds? What Then?

Ever since National Institute of Mental Health (NIMH) Director Thomas Insel wrote his Transforming Diagnosis article, announcing that for future research studies the NIMH is moving away from the DSM toward a new system called the Research Domain Criteria (RDoC), there have been countless articles and blog posts about what this may mean for the future of mental health.

One of the most insightful perspectives comes from 1 Boring Old Man, who points out that the NIMH is trying to do the same thing as the DSM-5 all over again by focusing on "biology, genetics and neuroscience so that scientists can define disorders by their causes, rather than their symptoms" [quote is from the NYTimes article]. Dr. Allen Frances, who chaired the DSM-IV task force, thinks the new NIMH approach has merit, but he strongly criticizes the NIMH for over-promising advances that won't arrive for a very long time while ignoring the present plight of the chronically mentally ill.

Neuroskeptic likened the controversy to the Protestant Reformation, with the NIMH's RDoC (Protestantism) rising to rival the DSM approach (Catholicism), though in the end both worship the same God (biological psychiatry). This focus on the biological basis of mental illness troubles me, since I think it is terribly limiting. So much of a person's well-being depends on relationships and is shaped by culture and society, as the Child in Mind blog pointed out. According to NIMH's mission statement:
The mission of NIMH is to transform the understanding and treatment of mental illnesses through basic and clinical research, paving the way for prevention, recovery, and cure. For the Institute to continue fulfilling this vital public health mission, it must foster innovative thinking and ensure that a full array of novel scientific perspectives are used to further discovery in the evolving science of brain, behavior, and experience. In this way, breakthroughs in science can become breakthroughs for all people with mental illnesses.
Though this statement does not explicitly constrain the NIMH to funding only studies of the biological aspects of mental illness, the language of "curing" someone clearly reflects a biological perspective. Dr. Insel is quite a True Believer in the premises of biological psychiatry, as his TEDxCaltech talk shows:
“If we waited for the ‘heart attack,’ we would be sacrificing 1.1 million lives every year in this country,” he said. “That is precisely what we do today when we decide that everyone with one of these brain disorders, brain circuit disorders, has a behavior disorder. We wait until the behavior emerges. That’s not early detection, that’s not early prevention.”
As ridiculous as the above position sounds to me, let me play devil's advocate and think ahead. Suppose the NIMH succeeds beyond anyone's wildest dreams, even Dr. Insel's. Suppose their biological paradigm elucidates, at the brain circuit level (including all the circuits for positive/negative emotional valence, cognition, social processes, and attention/arousal), exactly what is happening when a person is depressed, anxious, or hallucinating, and technology advances enough that treatments can directly target those dysfunctional circuits. What then?

First, to make the diagnosis, there would have to be some kind of brain imaging to examine the circuitry, likely coupled with a person's genetic profile. Given the brain's complicated wiring, the analysis would have to be done by a computer rather than a human. Treatments clearly wouldn't be like today's medications, which just target a receptor or set of receptors. To target a circuit, I can envision several methods: 1) the circuit could be ablated using precise neurosurgery or interventional neuroradiology; 2) a medication could be used in conjunction with a device outside the brain that allows the drug to become active only in targeted areas; or 3) nanoscale robots inside the brain could reprogram circuits directly. Because the brain is so plastic and easily influenced by the environment, a person would likely need repeated procedures or continuous treatment to keep the circuitry from reverting to its previous state. And we haven't even talked about prevention, which seems to be Dr. Insel's goal; for that, everyone would have to be brain-scanned and genotyped on a regular basis.

Certainly, new technologies will come along that I can't even imagine today. However, none of this will be cheap. Even certain cancer drugs today (which aren't that high-tech in the grand scheme of things) can cost hundreds of thousands of dollars per year. So once those super-expensive new brain treatments come out, who will get them? As we've seen with cancer treatment, rich people like Steve Jobs can get genotype-specific treatments and out-of-state liver transplants that ordinary folks cannot afford. Thus, it's hard for me to envision these advances in understanding brain circuitry doing much, if anything, for "public health."

Even trickier are the ethical issues these new advances would pose for society. If you can correct the circuits causing a person's cognitive dysfunction and hallucinations, then you can certainly damage them as well. Who would we trust with such technology? Pharma? The government? China already locks up dissidents in mental hospitals; imagine if the Chinese authorities could rewire the circuits underlying a person's desire to protest injustice. And what would happen if we no longer needed human contact, sunshine, exercise, or purpose in life to ward off depression or anxiety? Would we be content to live like the oblivious human batteries in The Matrix?

Before you accuse me of being a nutty conspiracy theorist, consider this: if I were to time-travel to the 1960s and tell people that in 50 years' time everyone would carry a pocket-sized device combining the functionality of TV, radio, telephone, telegraph, camera, newspapers, magazines, books, and myriad other games and diversions; that no one would have to remember anything anymore because they could just ask an entity called "Google"; and that people would stare at this device for hours a day, even during social situations like group dinners, I think they would have put me in a psychiatric hospital.

Obviously, not much of what I am saying is new or original; many science fiction authors have imagined such dystopias. You can argue that it's not the NIMH's job to consider potential consequences decades or centuries away, and you may be right. But I will say this: the risks of the biological psychiatry paradigm are great, and its payoffs uncertain. Directing those billions of dollars toward issues like transgenerational poverty, child abuse and neglect, interpersonal violence, and the housing of the mentally ill in jails and prisons, while boring, would almost certainly reduce the burden of mental illness and make our society a better place.