TBT: Of monsters and developmental mysteries

Once upon a time, I applied for an internship at my local public radio station. While I didn’t end up getting it, I stumbled across the writing sample I sent them today: an informal explanation of developmental biology. Since I am a freshly minted PhD as of this month (yay!), and still love development after devoting most of the last 10 years to studying it, I feel that my thoughts on development from several years ago still apply. So, TBT, here’s my waxing poetic on the underpinnings of all multicellular organisms:

Few things are as fascinating as creatures mistakenly born with a single central eye, or those with too few or too many fingers on each hand. How did they get to be that way? When things go so drastically wrong, what is the culprit, and should we care? It would be easy to dismiss such anomalies as freaks of nature. After all, they are exceedingly rare. But in the study of developmental biology these monsters are the key to understanding why most living things aren’t that way, and how they form correctly. The mistakes show us not only who we are, but why we are.

Every living thing, whether it’s a plant, animal, or fungus, has an instruction manual that tells how to make more of it. This manual, known as the genome, contains the code for every building material, every map of where things go, and every step needed to make and maintain a creature. The manual has a section (called a gene) on how to make a molecule named Sonic Hedgehog, which looks nothing like the video game character. However, this molecule and the instructions telling where to make it are crucially responsible for ensuring that a creature has two eyes on its face and five digits on each hand. Without Sonic Hedgehog, creatures are born cyclopic, with a single central eye, or with only one digit on each limb. Change the instruction manual and you’ll change how or what gets made. These changes, called mutations, are often disastrous. But every so often a creature is born with a mutation that helps it, gives it a little superpower over its peers, so that it’s a little more successful at reproducing. When this happens, the mutation is passed along to the progeny of the original mutant, and those progeny are more successful than their peers, and so on until the mutation becomes present in every member of the species, or until an entirely new species breaks off on its own. Mutation, the anarchy of the genome, is responsible for the incredible diversity among living things.

Developmental biologists love firstly to break things, and secondly to hijack them. We use mutation, nature’s own act of rebellion and innovation, to control development so that we can study it in animals like mice and chickens. Remember how the instruction manual includes a map for where things go? This applies to Sonic Hedgehog. By destroying, via mutation, the part of the manual that says where to put Sonic Hedgehog in the growing limb, we can block the growth of digits there. On the other hand (no pun intended), if we duplicate the map for where to put Sonic Hedgehog and put the copy on the other side of the limb, we can cause extra digits to sprout in mirror image to the original ones. By creating this controlled disorder developmental biologists may be making monsters, but the monsters let us answer the deepest questions about ourselves.

By looking for the misfits, by breaking the system in myriad ways, developmental biologists seek to understand. We study chaos so that we can see what order is supposed to look like, and so that we can recognize and fight chaos effectively when it knocks on our doors in the form of diseases like cancer. And sometimes we study chaos simply because we’re drawn to it, because it’s a fascinating corruption of ourselves. We study mutants because that’s what we are.

 

Featured image: my favorite model organism.

Let the kids vote.

I think it’s time to lower the voting age to 16.

This is something I’ve felt strongly about since I was in high school and bummed that I couldn’t vote in the 2004 presidential election (my parents can attest to me loudly voicing my frustration on this matter). I REALLY wanted to vote in high school. I registered way before I turned 18, even though I knew I’d have to re-register with a new address before an election I could vote in actually came up. Hell, I was so annoyed that I missed being able to vote in the 2006 midterms by a few months, when it looked like my district might finally swing to the left and we could boot our corrupt Republican Congressman, that I tried to convince my apathetic 18-year-old friend to register and let me “advise” her on how to fill out her ballot. Yes, I know that is illegal, but dammit, I was more politically aware than plenty of adults, and it seemed unfair and nonsensical that they had this right and I didn’t. I even wandered into my precinct’s polling place and casually asked them if I could fill out a ballot (spoiler: they didn’t let me). I was really excited to be part of the democratic process and I went out of my way to inform myself on the issues. Though most of my peers probably didn’t feel the same way, I don’t think that apathy is inevitable. I think it’s totally possible for teenagers to get interested and informed about politics, and to be effective— just look at the political action spearheaded by teenagers in the wake of the Parkland massacre. We should respect them by giving them the right to vote. I can think of plenty of good reasons why 16- and 17-year-olds should be able to vote, and no good reasons why they shouldn’t (though a number of arguments have been made against giving them the vote, and I’m open to hearing and debating about others). Let’s delve into it.

The maturity question

The main argument against lowering the voting age is generally that kids are too immature or don’t have enough life experience to make informed voting decisions. This is condescending and just plain false. First of all, it’s essentially saying, “Hey teenagers, you’re dumb and make bad decisions, so you can’t possibly be capable of thinking intelligently about important issues.” While it may be true that people’s brains keep developing into their mid-20s, we don’t consider that a good reason to keep anyone under 25 from voting, nor do we take voting rights away from people with demonstrable cognitive deficiencies. As long as you are at least 18 and not a felon, you can vote even if you have dementia, or are blackout drunk, or believe that you are the second coming of Christ, or think that Hillary Clinton was running a child sex trafficking ring out of a pizza parlor. We don’t apply an intelligence or critical thought test to anyone over 18 before giving them the right to vote, and yet we automatically assume that people under 18 aren’t cognitively advanced enough to handle the issues being voted on. Give me a break, and give kids some credit. They have valuable things to say, and many have done more to further an intelligent civic discourse than most adults.

Furthermore, I doubt there’s much difference in the cognitive abilities of someone who is 18 versus someone who is 17.5, and yet the 18-year-old gets to vote while the 17.5-year-old doesn’t. The voting age is kind of arbitrary in this way. The counterargument to that would be that we have to set the cutoff somewhere, but by setting it at 18, we’re disenfranchising people who are fully capable of making informed voting decisions. The cognitive skills needed for voting are known as cold cognition, and they are established before a person turns 16. These are the skills that allow people to make rational decisions based on assessing evidence before coming to a conclusion, without being influenced by emotion. Teenagers can absolutely do this, and they can absolutely use those skills to make responsible choices at the ballot box. In fact, in Austria, one of the few countries to allow 16- and 17-year-olds to vote, research shows that youth voters make voting decisions that are similar to those of people over 18. So no, they aren’t irrational and impulsive at the ballot box. When it comes to their voting choices, they’re just like all of us supposedly older and wiser voters.

Making the political process belong to them

Allowing 16- and 17-year-olds to vote would get them engaged in the political process early. Right now for a lot of teenagers, it probably feels pointless to pay attention to politics (or government matters in general) because they have no say in it, and it seems like something not worth worrying about until they are considered adults. They don’t feel any ownership of it because they’re excluded from participating in it. But political decisions affect everyone in this country, including those too young to vote. Congress kowtowing to the NRA and refusing to pass meaningful gun control has allowed kids to be massacred while trying to get an education. Even if those kids want to vote out the politicians who sit on their hands while angry white men/boys shoot up schools, something that directly impacts kids in a literal life-or-death way, they can’t do it. But if they were allowed to vote, they could feel empowered to make great changes in society and work to ensure their own safety (which they shouldn’t have to do, but that’s what things have come to). They could feel like civic engagement was another milestone to reach in adolescence, as opposed to an opt-in burden to take on as an adult or something that doesn’t belong to them. Let’s give kids a better reason to get engaged than having to fear for their lives due to school shootings.

Increasing voter turnout and easing the transition to civic engagement

We have a problem with voter turnout in the US. Youth voter turnout is particularly low; 46% of eligible voters aged 18-29 voted in the 2016 presidential election, and turnout is even lower during midterm elections. Of course, this doesn’t help dispel the stereotype that young people are uninformed and apathetic. Facilitating voting in schools would be a great way to easily bring young people into the voting process. On election day, there could be polling sites at high schools, and students 16 and older could be granted time to go and cast their ballots. Perhaps teachers could even give extra credit to students who vote, proven by bringing back a ballot stub to class after going to the polls (their peers younger than 16 could also receive extra credit for learning about the issues and voting in a mock election, to make things equitable). Nonpartisan information about the issues could be distributed to students in class (the same information that gets mailed to adult voters). I think that getting people started voting early, so they see what it’s like and get in the habit of learning about the issues and candidates, will set them up to keep voting later in life when they have to be independently responsible for it.

Right now we don’t do much to make teenagers feel any ownership of voting or political engagement, but then we expect them to immediately become independently engaged the moment they turn 18. Some adults even have the gall to chastise young people who don’t vote and blame them for unfavorable political outcomes. Fuck off with that argument. We aren’t setting teenagers up to vote easily or to care about civic engagement, when we definitely could. Why not help them? Obviously this won’t solve all of the voter turnout problems in the US, but I believe that supporting people in civic engagement early on would do a lot to motivate people down the road.

No taxation without representation

Sixteen-year-olds can work and pay taxes, and yet they have no say in what their tax money is spent on. This can definitely affect them directly: they have to pay into a system, but if their Congressperson wants to defund public schools or take away regulations on student loans meant to protect the borrower, they can’t do anything about it other than suffer the consequences. This is taxation without representation, which is why the US became an independent nation in the first place. And being taxed on their income isn’t the only thing teenagers can be compelled to do without having any kind of say. Teenagers can be tried as adults in court, which seemingly acknowledges that they’re capable of adult-level cognition when it comes to criminal activity, but for some reason not civic engagement. They (particularly teenage girls) can get married before the age of 18 as long as their parents “consent”, which often equates to parents forcing their daughter to marry her abuser. In the eyes of the law, she is old enough to get married and deal with all the responsibilities and ramifications associated with that, but she is not old enough to make an informed decision to vote against policies and lawmakers who support child marriage. How backwards is that? If we can force teenagers to do these things, we are obligated to give them a say in the matter.

I think this issue has more momentum now than ever before. The students from Marjory Stoneman Douglas High School have shown us repeatedly that teenagers can be intelligent public voices on the issues that affect society. They are not impulsive, or irrational, or incapable of understanding problems at the same level as adults. This isn’t a new phenomenon. Teenagers have always been capable of being informed, civic-minded voters, but adults have held them back. It’s time we empowered them to take their rightful place as participants in our democracy.

 

Featured image: the Indonesian embassy in Washington, D.C.

It’s okay to admit when you’re wrong.

I’m wrong sometimes. I’ve been wrong about a lot of things.

One clear example: when I started this blog in January 2017, I thought it was a reasonable expectation to write a new post once a week. For a lot of bloggers that’s definitely doable. But I underestimated how much time and mental energy would be taken up by my research, with two new projects going at the same time and exciting data rolling in and getting me hooked on bench science (again), and with my last semester as Art Director of the Berkeley Science Review bringing responsibilities and a chance to put my often-dormant art skills to use (see: the cover for the spring 2017 issue). I also started blogging for the BSR, which took up what little bandwidth for writing I had. I wasn’t slacking off, but I still felt guilty for neglecting my blog, mostly because I had made an unrealistic promise to myself to write more than I could. I’m not trying to make excuses for myself, but more to be honest about the reality of the situation so that I won’t feel guilty when I inevitably blog less often than I aspire to, and so I can have more realistic expectations in the future. And this is okay.

I’m going to be an armchair psychologist for a second. We often feel uncomfortable being wrong, to the point where, when presented with solid evidence disproving our claims, we’ll sometimes dig our heels in deeper. I wonder if this is just human nature, or if it has to do with people never learning to be comfortable with being wrong sometimes. When we realize we’re wrong, or when someone calls us out on it, we react as if it’s a personal attack as opposed to an objective fact that has nothing to do with our personal character. Or we see being visibly wrong as a sign that we are unintelligent, or careless, or incompetent: all things that could affect others’ perception of us, and people in general don’t want to be disliked or shunned (I think very few people truly give zero fucks about what others think of them, much as some of us try to behave that way).

A little bit of worrying about what others think about us is fine; it’s what allows us not to socially alienate other people and helps make us empathetic. It’s also good, and admirable, to stand firm on our beliefs when we have a strong backing for them. The problem arises when we ignore or dismiss evidence that we are wrong and continue making decisions based on our false views.

The way white Americans in general view and internalize our country’s history of mass atrocities against racial minorities is a good example of this. How many of us feel, and I mean really feel, an ever-present sense of guilt over the way black Americans were first enslaved, and then discriminated against and brutalized during the Jim Crow era? Or how Native Americans had their land seized and were forced on a death march on which thousands of people died? How many of us, thinking back to U.S. history class in high school, involuntarily felt our stomachs knot in disgust when learning about these things, because we were instilled with the idea that these atrocities were, and still are, a national responsibility to be shouldered by every American? Do we constantly think about how the historical fact of these hideous acts continues to affect our society now, and how exactly it contributes to the challenges faced by African American and Native American communities? No, most white Americans probably don’t. I’m willing to bet most of us learned about these things and thought, “Well, that’s terrible,” and on a deeper level, “It’s in the past and it’s not my problem”. Many of us were probably even taught a watered-down version of these events, or worse, a very inaccurate one.

Contrast this to how things are in post-World War II Germany*: people learn about the Holocaust multiple times in school, in great detail, with the intent to accept what happened but not forget it, and to learn from it. There’s some level of national collective guilt. It’s not that Germans whose grandparents fought in World War II should feel as though they are personally responsible for the actions of the Nazis, but rather that they are responsible for remembering what happened, recognizing how terrible it was, and preventing something that horrific from happening again. We just don’t do that in the U.S. We acknowledge that mistreatment of racial minorities was bad, but we then distance ourselves from it and don’t learn about it extensively, shrouding ourselves in a protective blanket of ignorance. We don’t fully take on the historical responsibility of being wrong. And because of this, an undercurrent of racism has been allowed to flourish.

In current U.S. politics, admitting you’re wrong is often seen as a sign of weakness. Politicians won’t admit they’re wrong unless it’s so glaringly obvious that people have figured it out without them even needing to admit it. They do this because if they do admit they’re wrong, they face serious backlash; their constituents lose faith in them, assuming they have no deeply held principles, or can’t be decisive, or that they aren’t loyal. This has to stop. We need to be more accepting of people being wrong, as long as they acknowledge it and try to make it right. Otherwise there’s no incentive for people to correct their mistakes, and all the more reason for them to dig their heels in and rationalize sticking to information that’s obviously wrong, or ignore information counter to their incorrect beliefs. We need to place more value on honesty, as opposed to stubborn refusal to entertain solid ideas counter to our own. We need to admit that we aren’t perfect, and that that’s okay, as long as we’re consistently trying to better ourselves.

 

~~

*I am not an expert on how Germans handle teaching of the Holocaust and have no personal experience with it, so I’m basing my opinion on what I’ve heard from the German people I’ve asked about this issue and some Googling. I don’t mean to put Germany on a pedestal; rather I think this is a good example of doing the right thing.

 

Featured image: A very big newt in Lake County.


New BSR blog: The Moral Responsibility of Genome Editing

Hey all, I’ve been busy with so many things over the last few months, one of which was writing this blog post for the Berkeley Science Review on the ethics of genome editing. We’re approaching an era where we can relatively easily edit genes in the cultured cells of sick patients, or in patients’ cells within their bodies, or even in early embryos, which brings up a whole host of ethical issues to consider. Check out the article for a discussion of these issues!

This is us.

Perhaps the “this is not what American values represent” attitude towards racial violence is aspirational; people want to believe that our country has reached a level of racial equality and peace such that it defines our society. At best, that involves a hope that we have overcome previous societal racism, with a vague memory of what that racism actually entailed. At worst, it involves denial that societal racism was ever really that bad, or defensive justification of it. But to respond to incidents like Charlottesville by saying “this isn’t us” ignores the stain of racial violence on American history, as well as how that history continues to impact our society.
 
This IS us, and it’s been us for centuries. “American values” used to represent enslavement of black people, and unchecked violence against them for generations after emancipation. “American values” used to represent denial of basic citizenship rights that white people take for granted, like the ability to vote or go to school, to choose where to own or rent property, or to congregate freely in public places. One could justifiably argue that “American values” still effectively represent these things, and that they just look slightly different than they did 100 years ago. What America has not done, unlike other nations that have committed large-scale atrocities, is admit what we did wrong and instill in our society a level of collective guilt about it.
 
Yes, it’s uncomfortable to own up to this history, but it’s what we have to do. Racism against black people is woven tightly into the legacy of America and nothing will change that part of our past. However, we can work to change its impact on our future as long as we recognize this fact. We cannot truly address the problem if we deny its existence.
(Originally written as thoughts in response to this article from The Atlantic.)

Hatred and bigotry: the Republican legacy.

I’m a little late in sharing this, but if you haven’t seen it yet, Vice News created a video that is an incredible look into the far-right white supremacists, mobilized by Trump, who sparked the violence in Charlottesville and seek to do the same elsewhere. It’s deeply disturbing, to put it mildly.
I am Jewish. My whole family is Jewish. The men shown in this video don’t know me or anything about me, but they state here that their ideal America is one that is “cleansed” of me and people like me. If you care about me, or any other Jewish person or person of color or queer person, this should chill you to the bone. If you are a decent human being, this should make you ill.
These white supremacists have been courted by the Republican Party for years, and the GOP has relied on their support while simultaneously pretending not to encourage them. Now we’re seeing the results of this Republican strategy: a president who campaigned on racial hatred, who egged on his violent far-right supporters and relied on them to win, who refuses to strongly denounce these white supremacists and instead insists that many of them (many of the neo-Nazis, the KKK, the people who thought that running over nonviolent counterprotesters with a car was justified) are “fine people”. A president who supports the racial profiling and inhumane treatment perpetrated by Joe Arpaio so much that he pardoned him. Emboldened white supremacists who seek to terrorize others. A majority of Republicans in Congress, sitting on their hands and doing nothing to help, but talking a lot about how “troubled” they are. They are the ones in a position to do something, and many of them recognize how wrong this overt bigotry is and that something should be done, but they are silent. Cowards.
This hatred and bigotry is the Republican legacy. People who voted Republican, and especially those who voted for Trump, voted for this. Even if you didn’t approve of Trump’s racist rhetoric and voted for him for other reasons, you tacitly approved this. It’s now your responsibility to stand against the white supremacists and far-right bigots, and to call Trump out for supporting them. If you remain silent, you are continuing to approve of their actions. You’re saying, “This racist terrorism is fine by me.” That makes you a coward too, and if you don’t believe you are a coward, you should ask yourself just what kind of person you truly are.

Weaponizing the scientific method

We’re experiencing an unprecedented level of lying in American politics right now, mostly courtesy of our “so-called” president. What is there to do about this? Right now it seems like no amount of calling out bullshit stops Trump from baldly lying, or his supporters from accepting the lies, or congressional Republicans from spinelessly hiding in a corner pretending it’s not happening. Trump is doubling down on his lies and lashing out at the media for exposing him, which threatens our free press and our ability to fight back against his wannabe-authoritarian regime. I’ve talked before about how hopeless this makes everything seem, and about how we don’t have much of a choice except to fight back. We are still scrambling to figure out the best way to fight.

As a scientist and someone who prefers to rely on logic to make decisions, I believe that one way to do this is by teaching people to get comfortable using the scientific method all the time. It’s not really that hard once you train yourself to think that way and apply it to all kinds of situations, even those that occur outside of the lab. So what is the scientific method? In a nutshell, it’s a set of principles used to conduct evidence-based inquiry. It’s founded on the idea that you can make an observation (“X phenomenon occurs”), formulate a hypothesis to explain the observation (“Y is acting on Z to cause X phenomenon”), and then run tests to find out whether your hypothesis is correct (“If I remove Y, does X phenomenon still occur?”). A good scientist will form multiple possible hypotheses, including null hypotheses (“Y is not acting on Z to cause X” or “Y and Z have nothing to do with X”), and set up tests that will generate observations to address each hypothesis. At the root of the scientific method is thinking of multiple possible scenarios and applying skepticism to all of them, as opposed to just accepting the first one you think of without questioning it further. Oftentimes you come up with a hypothesis that makes logical sense and is the simplest explanation for a phenomenon (i.e., it’s parsimonious), but when you investigate it further you find that it is not at all correct.

Let’s apply this to politics today. Here’s a basic example: Trump lost the popular vote by nearly 2.9 million votes, but he keeps claiming that this is due to massive voter fraud where millions of people voted illegally. You could just take his claim at face value and not question it. After all, he is the president, so he’s privy to lots of information the general public doesn’t have, and as the leader of the U.S. he should be acting with integrity, right? You could use this logic to form the hypothesis that he is correct and there is evidence of massive voter fraud. Or you could apply a bit of skepticism and formulate an alternative hypothesis: Trump’s claim is false, and there’s no evidence of the massive voter fraud that he cites. How to test this? You can look for the evidence that he claims exists just by Googling… and you won’t actually find any. What you will find are instances of his surrogates claiming that the voter fraud is an established fact, various reputable news agencies and fact checkers debunking the claim (even right-leaning Fox News admits there is no evidence for the claim), and a complete dearth of any official reports that massive voter fraud occurred (which I would link to, but I can’t link to something that doesn’t exist). In support of the claim you will find right-wing conspiracy websites like InfoWars that don’t cite any actual evidence. So based on that inquiry, you could conclude that Trump’s claim is false. It didn’t take much skepticism or effort to address the question, just enough to ask “Can I easily find any solid evidence of this?”

One problem with this strategy: we aren’t doing a great job of teaching people how to think rationally and critically. On a large scale, we would do this by refocusing our education standards around critical thinking (part of what Common Core and the Next Generation Science Standards aim to do, albeit with mixed results in how they teach the scientific method and critical thought). There has been a lot of pushback to this, particularly in conservative areas. Why? One explanation could be general distrust of government and antipathy towards regulations and standards demanded by the federal government. Another could be over-reliance on religion, which fundamentally demands faith in the absence of evidence. I’m not going to fully wade into this murky debate right now; I’ll just say that not all religion is bad, and plenty of religions do teach their followers to think critically and analyze things. But some religious groups don’t teach these principles, and they rely much more on encouraging their followers to just believe what they’re told by their pastor (or whatever religious leader), or face moral doom. This dangerously reinforces the idea that it’s okay to blindly follow certain authority figures without question. It extends past the church doors, to the school teacher slut-shaming high school girls instead of providing them with comprehensive sex education, to the public official who expresses skepticism over climate change despite an abundance of evidence that it’s happening and caused by humans. People believe what these authority figures say. We’re seeing it now, as more than half of Republicans accept Trump’s claim that he really won the popular vote, with the percentage being higher among Republicans with less education. One of the main reasons Trump supporters give for voting for him is his tough stance on immigration, and they seem to be happy with his controversial travel ban, even though the Department of Homeland Security recently found that people from the countries targeted by the ban pose no extraordinary threat compared to people from other Muslim-majority countries (Trump rejected this report, even though it was ordered by the White House). Rather than putting in a tiny bit of effort to look for evidence that these claims are true, or thinking about whether their news sources are biased, people blindly trust what Trump says. This underscores the importance of giving people an opportunity to learn and practice critical thought.

Fixing the education system to provide more training in critical thought and use of the scientific method is absolutely necessary long-term, but right now we need a strategy to deal with people who don’t have that training. So if you have conservative friends or family and you’re brave enough to talk politics with them, ask them why they believe things to be true. If they cite evidence that isn’t rigorous, ask them why they trust that source. I think it’s possible to do this without talking down to them; most people are probably capable of applying logic in their thinking even if they haven’t been trained to do so. Instead of trying to argue that someone is wrong by saying “here’s a fact to support my argument and you should believe it because it’s true” (even if that’s accurate), ask them why they think your fact isn’t true, and respectfully lead them to the evidence behind your argument. Perhaps it won’t work all the time, but if the seed of skepticism can be planted in at least some people, they may be more careful in their voting decisions in the future. If you rely on logic and rational skepticism to make your decisions, you have an obligation to help other people do the same. It’s worth a try.

 

Featured image: Barbara Lee’s town hall, February 18th, Oakland.