TBT: Of monsters and developmental mysteries

Once upon a time, I applied for an internship at my local public radio station. While I didn’t end up getting it, I sent them a writing sample, which I stumbled across today, that gives an informal explanation of developmental biology. Since I am a freshly minted PhD as of this month (yay!), and still love development after devoting most of the last 10 years to studying it, I feel that my thoughts on development from several years ago still apply. So, TBT, here’s my waxing poetic on the underpinnings of all multicellular organisms:

Few things are as fascinating as creatures mistakenly born with a single central eye, or those with too few or too many fingers on each hand. How did they get to be that way? When things go so drastically wrong, what is the culprit, and should we care? It would be easy to dismiss such anomalies as freaks of nature. After all, they are exceedingly rare. But in the study of developmental biology, these monsters are the key to understanding why most living things aren’t that way, and how they form correctly. The mistakes show us not only who we are, but why we are.

Every living thing, whether it’s plant, animal or fungus, has an instruction manual that tells how to make more of it. This manual, known as the genome, contains the code for every building material, every map of where things go, and every step needed to make and maintain a creature. The manual has a section (called a gene) on how to make a molecule named Sonic Hedgehog, which looks nothing like the video game character. However, this molecule and the instructions that say where to make it are crucially responsible for ensuring that a creature has two eyes on its face and five digits on each hand. Without Sonic Hedgehog, creatures are born cyclopic, with a single central eye, or with only one digit on each limb. Change the instruction manual and you’ll change how or what gets made. These changes, called mutations, are often disastrous. But every so often a creature is born with a mutation that helps it, gives it a little superpower over its peers, so that it’s a little more successful at reproducing. When this happens, the mutation is passed along to the progeny of the original mutant, and those progeny are more successful than their peers, and so on until the mutation becomes present in every member of the species, or until an entirely new species breaks off on its own. Mutation, the anarchy of the genome, is responsible for the incredible diversity among living things.

Developmental biologists love firstly to break things, and secondly to hijack them. We use mutation, nature’s own act of rebellion and innovation, to control development so that we can study it in animals like mice and chickens. Remember how the instruction manual includes a map for where things go? This applies to Sonic Hedgehog. By destroying, via mutation, the part of the manual that says where to put Sonic Hedgehog in the growing limb, we can block the growth of digits there. On the other hand (no pun intended), if we duplicate the map for where to put Sonic Hedgehog and put the copy on the other side of the limb, we can cause extra digits to sprout in mirror image to the original ones. By creating this controlled disorder developmental biologists may be making monsters, but the monsters let us answer the deepest questions about ourselves.
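
For readers who like to tinker, here’s a toy sketch in Python of that “change the map, change the limb” idea. Everything in it is invented for illustration (the signal gradient, the threshold, the positions); real limb patterning is vastly more complicated:

```python
# Toy model: the "map" lists positions where Sonic Hedgehog (Shh) is made
# along the growing limb; in this cartoon, a digit forms wherever the
# resulting signal is strong enough. All numbers are made up.

def shh_signal(sources, positions=10):
    """Signal at each position falls off with distance from each Shh source."""
    return [sum(max(0, 1 - abs(p - s) / 5) for s in sources)
            for p in range(positions)]

def count_digits(signal, threshold=0.3):
    return sum(level > threshold for level in signal)

normal   = shh_signal(sources=[9])      # Shh made at one edge of the limb
knockout = shh_signal(sources=[])       # mutation destroys the map entry
mirrored = shh_signal(sources=[0, 9])   # map duplicated on the other side

print(count_digits(normal))    # some digits form
print(count_digits(knockout))  # none: digit growth is blocked
print(count_digits(mirrored))  # extra digits, in mirror image
```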

By looking for the misfits, by breaking the system in myriad ways, developmental biologists seek to understand. We study chaos so that we can see what order is supposed to look like, and so that we can recognize and fight chaos effectively when it knocks on our doors in the form of diseases like cancer. And sometimes we study chaos simply because we’re drawn to it, because it’s a fascinating corruption of ourselves. We study mutants because that’s what we are.


Featured image: my favorite model organism.

Let the kids vote.

I think it’s time to lower the voting age to 16.

This is something I’ve felt strongly about since I was in high school, bummed that I couldn’t vote in the 2004 presidential election (my parents can attest to me loudly voicing my frustration on this matter). I REALLY wanted to vote in high school. I registered way before I turned 18, even though I knew I’d have to re-register with a new address before an election I could vote in actually came up. Hell, I was so annoyed that I missed being able to vote in the 2006 midterms by a few months, when it looked like my district might finally swing to the left and we could boot our corrupt Republican Congressman, that I tried to convince my apathetic 18-year-old friend to register and let me “advise” her on how to fill out her ballot. Yes, I know that is illegal, but dammit, I was more politically aware than plenty of adults, and it seemed unfair and nonsensical that they had this right and I didn’t. I even wandered into my precinct’s polling place and casually asked if I could fill out a ballot (spoiler: they didn’t let me). I was really excited to be part of the democratic process, and I went out of my way to inform myself on the issues. Though most of my peers probably didn’t feel the same way, I don’t think that apathy is inevitable. I think it’s totally possible for teenagers to get interested and informed about politics, and to be effective— just look at the political action spearheaded by teenagers in the wake of the Parkland massacre. We should respect them by giving them the right to vote. I can think of plenty of good reasons why 16- and 17-year-olds should be able to vote, and no good reasons why they shouldn’t (though a number of arguments have been made against giving them the vote, and I’m open to hearing and debating more). Let’s delve into it.

The maturity question

The main argument against lowering the voting age is generally that kids are too immature or don’t have enough life experience to make informed voting decisions. This is condescending and just plain false. First of all, it’s essentially saying, “Hey teenagers, you’re dumb and make bad decisions, so you can’t possibly be capable of thinking intelligently about important issues.” While it may be true that people’s brains keep developing into their mid-20s, we don’t consider that a good reason to keep anyone under 25 from voting, nor do we take voting rights away from people with demonstrable cognitive deficiencies. As long as you are older than 18 and not a felon, you can vote even if you have dementia, or are blackout drunk, or believe that you are the second coming of Christ, or think that Hillary Clinton was running a child sex trafficking ring out of a pizza parlor. We don’t apply an intelligence or critical thought test to anyone over 18 before giving them the right to vote, and yet we automatically assume that people under 18 aren’t cognitively advanced enough to handle the issues being voted on. Give me a break, and give kids some credit. They have valuable things to say, and many have done more to further an intelligent civic discourse than most adults.

Furthermore, I doubt there’s much difference in the cognitive abilities of someone who is 18 versus someone who is 17.5, and yet the 18-year-old gets to vote while the 17.5-year-old doesn’t. The voting age is kind of arbitrary in this way. The counterargument would be that we have to set the cutoff somewhere, but by setting it at 18, we’re disenfranchising people who are fully capable of making informed voting decisions. The cognitive skills needed for voting are known as cold cognition, and they are established before a person turns 16. These are the skills that allow people to make rational decisions by assessing evidence before coming to a conclusion, without being influenced by emotion. Teenagers can absolutely do this, and they can absolutely use those skills to make responsible choices at the ballot box. In fact, in Austria, one of the few countries that allow 16- and 17-year-olds to vote, research shows that youth voters make voting decisions similar to those of people over 18. So no, they aren’t irrational and impulsive at the ballot box. When it comes to their voting choices, they’re just like all of us supposedly older and wiser voters.

Making the political process belong to them

Allowing 16- and 17-year-olds to vote would get them engaged in the political process early. Right now for a lot of teenagers, it probably feels pointless to pay attention to politics (or government matters in general) because they have no say in it, and it seems like something not worth worrying about until they are considered adults. They don’t feel any ownership of it because they’re excluded from participating in it. But political decisions affect everyone in this country, including those too young to vote. Congress kowtowing to the NRA and refusing to pass meaningful gun control has allowed kids to be massacred while trying to get an education. Even if those kids want to vote out the politicians who sit on their hands while angry white men/boys shoot up schools, something that directly impacts kids in a literal life-or-death way, they can’t do it. But if they were allowed to vote, they could feel empowered to make great changes in society and work to ensure their own safety (which they shouldn’t have to do, but that’s what things have come to). They could feel like civic engagement was another milestone to reach in adolescence, as opposed to an opt-in burden to take on as an adult or something that doesn’t belong to them. Let’s give kids a better reason to get engaged than having to fear for their lives due to school shootings.

Increasing voter turnout and easing the transition to civic engagement

We have a problem with voter turnout in the US. Youth voter turnout is particularly low; 46% of eligible voters aged 18-29 voted in the 2016 presidential election, and turnout is even lower during midterm elections. Of course, this only reinforces the stereotype that young people are uninformed and apathetic. Facilitating voting in schools would be a great way to easily bring young people into the voting process. On election day, there could be polling sites at high schools, and students 16 and older could be granted time to go and cast their ballots. Perhaps teachers could even give extra credit to students who vote, demonstrated by bringing a ballot stub back to class after going to the polls (their peers who are younger than 16 could also receive extra credit for learning about the issues and voting in a mock election, to make things equitable). Nonpartisan information about the issues could be distributed to students in class (the same information that gets mailed to adult voters). I think that getting people started voting early, so they see what it’s like and get in the habit of learning about the issues and candidates, will set them up to keep voting later in life when they have to be independently responsible for it.

Right now we don’t do much to make teenagers feel any ownership of voting or political engagement, but then we expect them to immediately become independently engaged the moment they turn 18. Some adults even have the gall to chastise young people who don’t vote and blame them for unfavorable political outcomes. Fuck off with that argument. We aren’t setting teenagers up to vote easily or to care about civic engagement, when we definitely could. Why not help them? Obviously this won’t solve all of the voter turnout problems in the US, but I believe that supporting people in civic engagement early on would do a lot to keep them motivated down the road.

No taxation without representation

16-year-olds can work and pay taxes, and yet they have no say in what their tax money is spent on. This can definitely affect them directly: they have to pay into a system, but if their Congressperson wants to defund public schools or take away regulations on student loans meant to protect the borrower, they can’t do anything about it other than suffer the consequences. This is taxation without representation, which is why the US became an independent nation in the first place. And being taxed on their income isn’t the only thing teenagers can be compelled to do without having any kind of say. Teenagers can be tried as adults in court, seemingly acknowledging that they’re capable of adult-level cognition as applied to criminal activity, but for some reason not to civic engagement. They (particularly teenage girls) can get married before the age of 18 as long as their parents “consent”, which often equates to parents forcing their daughter to marry her abuser. In the eyes of the law, she is old enough to get married and deal with all the responsibilities and ramifications associated with that, but she is not old enough to make an informed decision to vote against policies and lawmakers who support child marriage. How backwards is that? If we can force teenagers to do these things, we are obligated to give them a say in the matter.

I think this issue has more momentum now than ever before. The students from Marjory Stoneman Douglas High School have shown us repeatedly that teenagers can be public voices for intelligent expression of the issues that affect society. They are not impulsive, or irrational, or incapable of understanding problems at the same level as adults. This isn’t a new phenomenon. Teenagers have always been capable of being informed, civic-minded voters, but adults have held them back. It’s time now that we empower them to take their rightful place as participants in our democracy.


Featured image: the Indonesian embassy in Washington, D.C.

It’s okay to admit when you’re wrong.

I’m wrong sometimes. I’ve been wrong about a lot of things.

One clear example: when I started this blog in January 2017, I thought it was a reasonable expectation to write a new post once a week. For a lot of bloggers that’s definitely doable. But I underestimated how much time and mental energy would be taken up by my research, with two new projects going at the same time and exciting data rolling in and getting me hooked on bench science (again), and with my last semester as Art Director of the Berkeley Science Review bringing responsibilities and a chance to put my often-dormant art skills to use (see: the cover for the spring 2017 issue). I also started blogging for the BSR, which took up what little bandwidth for writing I had. I wasn’t slacking off, but I still felt guilty for neglecting my blog, mostly because I had made an unrealistic promise to myself to write more than I could. I’m not trying to make excuses for myself, but more to be honest about the reality of the situation so that I won’t feel guilty when I inevitably blog less often than I aspire to, and so I can have more realistic expectations in the future. And this is okay.

I’m going to be an armchair psychologist for a second. We often feel uncomfortable being wrong, to the point where, when presented with solid evidence disproving our claims, we’ll sometimes dig our heels in deeper. I wonder if this is just human nature, or if it has to do with never learning to be comfortable with being wrong sometimes. When we realize we’re wrong, or when someone calls us out on it, we react as if it’s a personal attack as opposed to an objective fact that has nothing to do with our personal character. Or we see being visibly wrong as demonstrating that we are unintelligent, or careless, or incompetent: all things that could affect others’ perception of us, and people in general don’t want to be disliked or shunned (I think very few people truly give zero fucks about what others think of them, much as some of us try to behave that way).

A little bit of worrying about what others think about us is fine; it’s what allows us not to socially alienate other people and helps make us empathetic. It’s also good, and admirable, to stand firm on our beliefs when we have a strong backing for them. The problem arises when we ignore or dismiss evidence that we are wrong and continue making decisions based on our false views.

The way white Americans in general view and internalize our country’s history of mass atrocities against racial minorities is a good example of this. How many of us feel, and I mean really feel, an ever-present sense of guilt over the way black Americans were first enslaved, and then discriminated against and brutalized during the Jim Crow era? Or how Native Americans had their land seized and were forced on a death march on which thousands of people died? How many of us, thinking back to U.S. history class in high school, involuntarily felt our stomachs knot in disgust when learning about these things, because we were instilled with the idea that these atrocities were, and still are, a national responsibility to be shouldered by every American? Do we constantly think about how the historical fact of these hideous acts continues to affect our society now, and how exactly it contributes to the challenges faced by African American and Native American communities? No, most white Americans probably don’t. I’m willing to bet most of us learned about these things and thought, “Well, that’s terrible,” and on a deeper level, “It’s in the past and it’s not my problem”. Many of us were probably even taught a watered-down version of these events, or worse, a very inaccurate one.

Contrast this to how things are in post-World War II Germany*: people learn about the Holocaust multiple times in school, in great detail, with the intent to accept what happened but not forget it, and to learn from it. There’s some level of national collective guilt. It’s not that Germans whose grandparents fought in World War II should feel as though they are personally responsible for the actions of the Nazis, but rather that they are responsible for remembering what happened, recognizing how terrible it was, and preventing something that horrific from happening again. We just don’t do that in the U.S. We acknowledge that mistreatment of racial minorities was bad, but we then distance ourselves from it and don’t learn about it extensively, shrouding ourselves in a protective blanket of ignorance. We don’t fully take on the historical responsibility of being wrong. And because of this, an undercurrent of racism has been allowed to flourish.

In current U.S. politics, admitting you’re wrong is often seen as a sign of weakness. Politicians won’t admit they’re wrong unless it’s so glaringly obvious that everyone has already figured it out. They do this because if they admit they’re wrong, they face serious backlash; their constituents lose faith in them, assuming they have no deeply held principles, or can’t be decisive, or aren’t loyal. This has to stop. We need to be more accepting of people being wrong, as long as they acknowledge it and try to make it right. Otherwise there’s no incentive for people to correct their mistakes, and all the more reason for them to dig their heels in and rationalize sticking to information that’s obviously wrong, or to ignore information counter to their incorrect beliefs. We need to place more value on honesty, as opposed to stubborn refusal to entertain solid ideas counter to our own. We need to admit that we aren’t perfect, and that that’s okay, as long as we’re consistently trying to better ourselves.


~~

*I am not an expert on how Germans handle teaching of the Holocaust and have no personal experience with it, so I’m basing my opinion on what I’ve heard from the German people I’ve asked about this issue and some Googling. I don’t mean to put Germany on a pedestal; rather I think this is a good example of doing the right thing.


Featured image: A very big newt in Lake County.


New BSR blog: The Moral Responsibility of Genome Editing

Hey all, I’ve been busy with so many things over the last few months, one of which was writing this blog post for the Berkeley Science Review on the ethics of genome editing. We’re approaching an era where we can relatively easily edit genes in the cultured cells of sick patients, or in patients’ cells within their bodies, or even in early embryos— this brings up a whole host of ethical issues to consider. Check out the article for a discussion of these issues!

This is us.

Perhaps the “this is not what American values represent” attitude towards racial violence is aspirational; people want to believe that our country has reached a level of racial equality and peace such that it defines our society. At best, that involves a hope that we have overcome previous societal racism, with a vague memory of what that racism actually entailed. At worst, it involves denial that societal racism was ever really that bad, or defensive justification of it. But to respond to incidents like Charlottesville by saying “this isn’t us” ignores the stain of racial violence on American history, as well as how that history continues to impact our society.
 
This IS us, and it’s been us for centuries. “American values” used to represent enslavement of black people, and unchecked violence against them for generations after emancipation. “American values” used to represent denial of basic citizenship rights that white people take for granted, like the ability to vote or go to school, to choose where to own or rent property, or to congregate freely in public places. One could justifiably argue that “American values” still effectively represent these things, and that they just look slightly different than they did 100 years ago. What America has not done, unlike other nations that have committed large-scale atrocities, is admit what we did wrong and instill in our society a level of collective guilt about it.
 
Yes, it’s uncomfortable to own up to this history, but it’s what we have to do. Racism against black people is woven tightly into the legacy of America and nothing will change that part of our past. However, we can work to change its impact on our future as long as we recognize this fact. We cannot truly address the problem if we deny its existence.
(Originally written as thoughts in response to this article from The Atlantic.)

Hatred and bigotry: the Republican legacy.

I’m a little late in sharing this, but if you haven’t seen it yet, Vice News created a video that is a revealing look into the far-right white supremacists who, mobilized by Trump, sparked the violence in Charlottesville and seek to do so elsewhere. It’s incredibly disturbing, to put it mildly.
I am Jewish. My whole family is Jewish. The men shown in this video don’t know me or anything about me, but they state here that their ideal America is one that is “cleansed” of me and people like me. If you care about me, or any other Jewish person or person of color or queer person, this should chill you to the bone. If you are a decent human being, this should make you ill.
These white supremacists have been courted by the Republican party for years, and the GOP has relied on their support while simultaneously pretending not to encourage them. Now we’re seeing the results of this Republican strategy: a president who campaigned on racial hatred, who egged on his violent far-right supporters and relied on them to win, who refuses to strongly denounce these white supremacists and instead insists that many of them– many of the neo-Nazis, the KKK, the people who thought that running over nonviolent counterprotesters with a car was justified– are “fine people”. A president who supports the racial profiling and inhumane treatment perpetrated by Joe Arpaio so much that he pardoned him. Emboldened white supremacists who seek to terrorize others. A majority of Republicans in Congress, sitting on their hands and doing nothing to help, but talking a lot about how “troubled” they are. They are the ones in the position to do something, and many of them recognize how wrong this overt bigotry is and that something should be done, but they are silent. Cowards.
This hatred and bigotry is the Republican legacy. People who voted Republican, and especially those who voted for Trump, voted for this. Even if you didn’t approve of Trump’s racist rhetoric and voted for him for other reasons, you tacitly approved this. It’s now your responsibility to stand against the white supremacists and far-right bigots, and to call Trump out for supporting them. If you remain silent, you are extending your approval of their actions. You’re saying, “This racist terrorism is fine by me.” That makes you a coward too, and if you don’t believe you are a coward, you should ask yourself just what kind of person you truly are.

Weaponizing the scientific method

We’re experiencing an unprecedented level of lying in American politics right now, mostly courtesy of our “so-called” president. What is there to do about this? Right now it seems like no amount of calling out bullshit stops Trump from baldly lying, or his supporters from accepting the lies, or congressional Republicans from spinelessly hiding in a corner pretending it’s not happening. Trump is doubling down on his lies and lashing out at the media for exposing him, which threatens our free press and our ability to fight back against his wannabe-authoritarian regime. I’ve talked before about how hopeless this makes everything seem, and about how we don’t have much of a choice except to fight back. We are still scrambling to figure out the best way to fight.

As a scientist and someone who prefers to rely on logic to make decisions, I believe that one way to do this is by teaching people to get comfortable using the scientific method all the time. It’s not really that hard once you train yourself to think that way and apply it to all kinds of situations, even those that occur outside of the lab. So what is the scientific method? In a nutshell, it’s a set of principles used to conduct evidence-based inquiry. It’s founded on the idea that you can make an observation (“X phenomenon occurs”), formulate a hypothesis to explain the observation (“Y is acting on Z to cause X phenomenon”), and then do tests to find out if your hypothesis is correct or not (“If I remove Y, does X phenomenon still occur?”). A good scientist will form multiple possible hypotheses, including null hypotheses (“Y is not acting on Z to cause X” or “Y and Z have nothing to do with X”), and set up tests that will generate observations to address each hypothesis. At the root of the scientific method is thinking of multiple possible scenarios and applying skepticism to all of them, as opposed to just accepting the first one you think of without questioning it further. Oftentimes you come up with a hypothesis that makes logical sense and is the simplest explanation for a phenomenon (i.e. it’s parsimonious), but when you investigate it further you find that it is not at all correct.
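
If it helps to see that remove-Y test spelled out, here’s a toy sketch in Python. It invents a world where Y really does drive X; the 0.9 and 0.1 rates are made up purely for illustration:

```python
import random

def trial(y_present: bool) -> bool:
    """One observation: does phenomenon X occur?"""
    chance_of_x = 0.9 if y_present else 0.1  # invented effect sizes
    return random.random() < chance_of_x

def experiment(n: int = 1000) -> None:
    # Hypothesis: Y is acting on Z to cause X.
    # Null hypothesis: Y has nothing to do with X.
    with_y = sum(trial(True) for _ in range(n)) / n
    without_y = sum(trial(False) for _ in range(n)) / n
    print(f"X observed with Y:    {with_y:.0%}")     # ~90%
    print(f"X observed without Y: {without_y:.0%}")  # ~10%
    # If removing Y makes X largely disappear, the evidence favors the
    # hypothesis over the null; if the two rates matched, the null would win.

experiment()
```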

Let’s apply this to politics today. Here’s a basic example: Trump lost the popular vote by nearly 2.9 million votes, but he keeps claiming that this is due to massive voter fraud where millions of people voted illegally. You could just take his claim at face value and not question it. After all, he is the president, so he’s privy to lots of information the general public doesn’t have, and as the leader of the U.S. he should be acting with integrity, right? You could use this logic to form the hypothesis that he is correct and there is evidence of massive voter fraud. Or you could apply a bit of skepticism and formulate an alternative hypothesis: Trump’s claim is false, and there’s no evidence of the massive voter fraud that he cites. How to test this? You can look for the evidence that he claims exists just by Googling… and you won’t actually find any. What you will find are instances of his surrogates claiming that the voter fraud is an established fact, various reputable news agencies and fact checkers debunking the claim (even right-leaning Fox News admits there is no evidence for the claim), and a complete dearth of any official reports that massive voter fraud occurred (which I would link to, but I can’t link to something that doesn’t exist). In support of the claim you will find right-wing conspiracy websites like InfoWars that don’t cite any actual evidence. So based on that inquiry, you could conclude that Trump’s claim is false. It didn’t take much skepticism or effort to address the question, just enough to ask “Can I easily find any solid evidence of this?”

One problem with this strategy: we aren’t doing a great job at teaching people how to think rationally and critically. On a large scale, we would do this by refocusing our education standards around critical thinking (part of what Common Core and the Next Generation Science Standards aim to do, albeit with mixed results in how they teach the scientific method and critical thought). There has been a lot of pushback to this, particularly in conservative areas. Why? One explanation could be general distrust of government and antipathy towards regulations and standards demanded by the federal government. Another could be over-reliance on religion, which fundamentally demands faith in the absence of evidence. I’m not going to fully wade into that murky debate right now, but I will say that not all religion is bad, and it’s not true that no religions teach their followers to think critically and analyze things. Some religious groups, though, don’t teach these principles, and instead encourage their followers to just believe what they’re told by their pastor (or whatever religious leader), or face moral doom. This dangerously reinforces the idea that it’s okay to blindly follow certain authority figures without question. It extends past the church doors, to the school teacher slut-shaming high school girls instead of providing them with comprehensive sex education, to the public official who expresses skepticism over climate change despite an abundance of evidence that it’s happening and caused by humans. People believe what these authority figures say.

We’re seeing it now, as more than half of Republicans accept Trump’s claim that he really won the popular vote, with the percentage being higher among Republicans with less education. One of the main reasons Trump supporters give for voting for him was his tough stance on immigration, and they seem to be happy with his controversial travel ban, even though the Department of Homeland Security recently found that people from the countries targeted by the ban pose no extraordinary threat compared to people from other Muslim-majority countries (Trump rejected this report, even though it was ordered by the White House). Rather than putting in a tiny bit of effort to look for evidence that those claims are true, or considering whether their news sources are biased, many people blindly trust what Trump says. This underscores the importance of providing people with an opportunity to learn and practice critical thought.

Fixing the education system to provide more training in critical thought and use of the scientific method is absolutely necessary long-term, but right now we need a strategy to deal with people who don’t have that training. So if you have conservative friends or family and you’re brave enough to talk politics with them, ask them why they believe things to be true. If they cite something as evidence that isn’t rigorous, ask them why they trust that source. I think it’s possible to do this without talking down to them; most people are probably capable of applying logic in their thinking even if they haven’t been trained to do so. Instead of trying to argue that someone is wrong and back it up by saying “here’s a fact to support my argument and you should believe it because it’s true” (even if that’s accurate), ask them why they think your fact isn’t true, and respectfully lead them to your evidence to back up your argument. Perhaps it won’t work all the time, but if the seed of skepticism can be planted in at least some people, they may be more careful in their voting decisions in the future. If you rely on logic and rational skepticism to make your decisions, you have an obligation to help other people do the same. It’s worth a try.


Featured image: Barbara Lee’s town hall, February 18th, Oakland.

Open access and peer review, in a nutshell

Let’s talk about something scientific.

One of the key underpinnings of the scientific process is the ability to share research results with the world. Before scientists do this, we share our results with each other to get feedback on our work and suggestions on what other experiments we can do to provide more solid evidence for our claims*. This is a fundamental part of the publishing process and is known as peer review, which is basically just scientists checking each other’s work. Who is more qualified to do this than other scientists in the field? If you’ve looked at an article published in a scientific journal recently, you might notice that 1) it’s very dense, and 2) the techniques used, and often the questions asked, are pretty complex. It would likely be hard for someone with no scientific training, or even a scientist from a different field, to provide useful critique, or to spot things the authors may have overlooked. So when we submit our manuscripts to journals for review, we try to have them reviewed by other scientists in our sub-field who are most familiar with the techniques and questions discussed in the manuscript (and thus, the benefits and pitfalls of what we discuss).

If scientists didn’t rely on peer review, we’d be able to publish just about anything and claim it to be fact, and then it would be up to the general public to critique it and spread the word about whether or not the results are valid. That would just be inefficient, and highly unlikely to succeed. Peer review acts as both a filter and a stamp of approval**.

After a manuscript is peer-reviewed and published in a journal, that information is theoretically available to the public and part of the established scientific knowledge base. But scientific research isn’t truly available to the public unless it’s actually accessible. Many journals are behind what’s known as a paywall: you have to subscribe in order to access any content beyond the abstract (a summary of what an article is about). This is similar to how the New York Times charges $2.75/week for digital access. The difference is that the subscription costs of many journals are exceedingly high, such that most individual people can’t afford one subscription, let alone subscriptions to multiple journals. Scientists can usually access these articles because they work for a university or company that shoulders the cost of subscriptions to many journals, but depending on how well-funded your employer is, the cost may still be prohibitive.

Why is this a problem? There’s the obvious issue of forcing published science into a black box that remains mysterious to the general public, which helps to feed the perception that scientists’ work is beyond the reach of “normal people” and blocks public interest in all but the sexiest or weirdest stories. There’s also the fact that a majority of scientific research is paid for by the government, which uses taxpayer money to fund grants. So taxpayer funds go to facilitate scientific research, but then most taxpayers can’t actually read about the research they paid for. And if the research isn’t even made available to all scientists, it hampers future scientific progress (what do scientists have to build on if they don’t know the current state of the field?). This is where the idea of open access comes in.

Some publications are open access, like the PLOS journals and eLife, and these publications do not require a reader to pay to view their articles. Other publications, like Nature and Science, charge a subscription fee. Nature’s fee is $3.90/issue. Perhaps that sounds on par with subscriptions to non-scientific magazines and newspapers, but keep in mind one fundamental difference: you can get the news from multiple sources, so if something important happens, several news agencies will report on the same story, and you don’t necessarily need to pay for it. With scientific publications, a research article will only be published in one journal, so to access all research as it comes out, you’d have to pay for subscriptions to many different journals. It adds up quickly, and effectively leads to people paying twice for scientific research (assuming they already pay taxes).
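
To put numbers on “it adds up quickly,” here’s a quick back-of-the-envelope sketch. Only the $3.90/issue figure comes from above; the journal count and average fee are hypothetical:

```python
# Back-of-the-envelope subscription math. The only figure taken from the
# post is Nature's $3.90/issue; everything else is an assumption.

NATURE_PER_ISSUE = 3.90
NATURE_ISSUES_PER_YEAR = 51  # Nature publishes weekly, roughly 51 issues/year

nature_yearly = NATURE_PER_ISSUE * NATURE_ISSUES_PER_YEAR
print(f"Nature alone: ${nature_yearly:.2f}/year")  # ~$198.90

# A researcher following, say, 15 journals at a hypothetical average
# of $200/year each:
n_journals = 15
avg_yearly_fee = 200
print(f"{n_journals} journals: ${n_journals * avg_yearly_fee:,}/year")  # $3,000
```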

Together, peer review and open access are fundamental to scientists’ ability to share our work with the public, demonstrate convincingly that our findings are accurate, and allow non-scientists to engage with scientific research. Attempts to limit this are wholly detrimental to the scientific process and public understanding of science. Last month, the Trump administration ordered a media blackout on several government agencies including the EPA, and also indicated that research from EPA scientists would need the administration’s approval so as to “reflect the new administration”. The Trump administration is not run by scientists, and it’s unclear who in the administration would be reviewing scientific results. This amounts to unqualified, politically motivated people deciding based on their agenda what science gets published—clearly problematic and fundamentally counter to widely-held standards of scientific integrity.

Regardless of who is in office, scientists should be working to improve peer review and the general public’s access to scientific research. On top of that, we should work to help non-scientists understand the process of doing research and the lengths we go to in order to demonstrate that, to the best of our knowledge, our findings are accurate. Without this line of communication, we will be forever holed up in our ivory towers, piddling away at experiments that will never make as great an impact as they should, because people either cannot hear or cannot understand us.

~~

*Or to find out if our claims really are accurate; sometimes you do another experiment and it demonstrates that what you thought was an interesting phenomenon is actually just noise, or that it’s less significant than you thought.

**It’s not a perfect system, of course. Peer-reviewed results do get published that are eventually shown to be false upon further testing, or sometimes after it’s discovered that data was fudged. An ideal system would have scientists acting with integrity 100% of the time. But as in every other field, people are sometimes deceptive, and do things that undermine the system when it looks like it will benefit them. Sometimes we publish results that we think are correct, but later advances in the field or attempts to repeat an experiment show that those results are not correct. This is an ongoing struggle, and peer review is one of the safeguards against it.

Featured image: Berkeley neighborhood flower.


Don’t just check the box.

It’s been a very feminist week. There was a massive, worldwide march of millions to stand up for women’s rights, and to stand in solidarity with other groups targeted by the new Trump administration (read: anyone who isn’t a wealthy, white, Christian male who doesn’t criticize Trump). People were out in full force to make themselves seen, to make themselves heard; people with their children, people all across the gender spectrum, people of all colors, groups of scientists, teachers, you name it. It was incredible. And still, some people were invisible.

Last Monday I saw the movie Hidden Figures, which told the story of three African American women “computers” working at NASA in the 1960s and the struggles they faced to advance their careers and be taken seriously. These women—Katherine G. Johnson, Dorothy Vaughan, and Mary Jackson—were instrumental to the success of the space program, but because of two inherent qualities they possessed (their femaleness and their blackness), they had to fight so much harder to access opportunities that white men were handed, they were not given nearly as much credit as they deserved, and they have been overlooked by history. I never learned about these three women in school. If you had asked me before Monday to name a historic black NASA employee, I regrettably wouldn’t have been able to do so. This is partially my fault for not being proactive in learning about African American history beyond the snippets taught in my history classes, but it also shows the gaps that persist in the education system and our culture. People are omitted, because omitting women and people of color is a normalized action in our culture. Perhaps in my case, this is a symptom of having gone to suburban schools in a mostly white area. People of color, especially black people, were relatively uncommon there, and since they weren’t visibly present, it was easy to forget to talk about their existence and contributions. Their history was largely hidden from me.

Just as the three protagonists of Hidden Figures have been overlooked, so have the people who laid the groundwork for the Women’s March that occurred on January 21st, 2017. One of my favorite podcasts, Stuff Mom Never Told You (RIP, I don’t know where I will get my feminist fix without you), recently featured an episode on women’s marches and highlighted the fraught racial history of the feminist movement. While white feminists in the 19th and 20th centuries fought for the right to vote, they simultaneously marginalized and elected to exclude black women from the movement. Much later, in the 1990s, black women organized the Million Woman March in Philadelphia, with the intent of empowering black women and with goals including increasing access to education and healthcare for black communities. But I doubt most people marching in 2017 know much about this history. I certainly didn’t until I listened to that SMNTY episode. If feminism as a movement is going to succeed, it must be inclusive, and it must take this history and the current social climate into account. Yes, it feels great to march and send a message that we stand with each other and against the tyranny of the Trump administration, but it’s not enough to actually make a change.

So how do we make change happen? White feminists especially need to start by being aware that women of color have had, and continue to have, greater struggles to be taken seriously and claim their rights. Be aware of the people who came before us and did the work decades ago. Listen to what they have to say, and ask what you can do to help them. Stand beside them when they march for their rights, even if it doesn’t directly affect you, the way men joined the Women’s March in 2017. Keep their concerns in mind as you continue to fight for people’s human rights, hopefully by calling your representatives and getting involved. Use whatever privilege you have to elevate their voices. I will try to do my part by blogging about underrepresented groups and people to raise awareness, even if my impact is small. And of course, I’m open to suggestions, because I certainly don’t know everything, and to some extent I’m a product of my upbringing and all the biases and ignorance it came with.

This is all in the hope that one day we’ll fulfill our lofty goal of achieving equal opportunity for everyone, regardless of race, gender, class, orientation, or whichever method of pigeonholing people you choose. By overlooking women and people of color, even by passively excluding them just because we don’t think about them, we are missing out on the potential of all those people who never got the chance to apply their talents to the best of their abilities. We are shortchanging our society by keeping doors closed to underrepresented groups. Things have improved slowly over time, yes, but we are not living in a post-racial society. If anything, Obama’s presidency demonstrated that plainly. So as we move into the age of anti-Trump resistance and attempts to redefine the truth, let’s all make an effort to learn about the history of our movement(s), even if that history is unpleasant, and take it into account as we act to protect and expand civil rights. And as Caroline from SMNTY said, don’t just attend this one march, check the box to say you participated, and call it a day. Take action, keep fighting, and make an effort. This is our job.

Featured image: Women’s March in Oakland, January 21st, 2017. Picture courtesy of G. Mannell.

Dear President Obama

This week I wrote a letter to President Obama, which I will post below (the version I sent was shortened somewhat to accommodate the White House website’s 2500 character limit). I decided to focus on some of the positives of Obama’s presidency, because there’s no point in critiquing him in such a short letter sent days before his presidency ends. I do not agree with everything he’s done as President, but overall I think he has accomplished a great deal and doesn’t always get the credit he deserves. For example, he averted a crisis by working to halt and reverse the recession, keeping it from becoming a full-blown depression. But I think a lot of people don’t think of this as one of his huge accomplishments, because he caused something not to happen; it was almost an invisible achievement. Think of where we’d be if all the President and Congress had done was to say “tax cuts for everyone!” He managed to pass a healthcare bill, something which other Presidents and many Congresspeople had tried and failed to do for decades. Even if it doesn’t last past Trump’s first year, people will start to miss it when it’s gone, and those who took it for granted will realize how important it was. He normalized acceptance and support of LGBT rights at the executive level. He fought to protect women’s rights and the rights of undocumented immigrants. He rebuilt our country’s reputation around the world.

Most of all, he was incredibly competent. He stood for calm, measured rationality, even in the face of intense and often unfair (or straight up racist) criticism. He set an example for how a leader should act. These things and more will set him in stark contrast to his successor.

Here is the longer version of my letter:

Dear President Obama,
First, I wanted to thank you immensely for your eight years serving us as President. You were the first person I voted for (when I was 19 and voted for you in the 2008 primary), and you have been an enormous inspiration to me and so many other young people for years. You were a light in the storm, and I am so grateful that you set an example of hope, pragmatism, and competence when many of us were feeling jaded from eight years of Bush. When you won in 2008, I burst out of my college apartment to see dozens of people dancing and cheering in the streets, elated at the historic significance of an African American being elected President and the potential we all saw in your goals. I have not been disappointed, and I believe that tens of millions of Americans would probably love to give you a third term (though it would be totally understandable if you felt done, with or without the 22nd amendment).
It would take too long to thank you for everything you’ve accomplished, but a few things that specifically affected me are the Affordable Care Act, which I benefited from in multiple ways (most importantly by [redacted because of personal medical information that I don’t feel like sharing publicly at this moment]), your steadfast support for women’s rights, and your support of scientific research and critical thought. I don’t think you get enough credit for everything you’ve done, and you certainly would have accomplished more if it weren’t for the Republicans throwing a temper tantrum for eight years. It means so much to me and other progressives to see you fighting for those causes and trying to bring about meaningful change, even if your efforts are blocked.
Perhaps what I will miss most, though, is having you as the figurehead for our country. You and your family are paragons of decency. What’s more, you truly value rationality and forming your opinions based on solid evidence. Those are two things that will be sorely lacking in the incoming administration, and it’s been shocking to watch so many people in our country sit by as Donald Trump and his ilk trample upon the core principles of our society. I’ll admit that it’s been very hard to maintain a sense of hope that we will somehow bounce back from this disaster. I was hopeful that if Hillary Clinton won we could address a number of progressive issues, but those things now seem far off the table: we’re going to be doing damage control for however many years the Trump dumpster fire lasts, and then clawing our way back to where we were when you left office, so we’re not even close to joining the rest of the developed world on things like paid parental leave and affordable college. We’ve been set so far back that people are having to re-learn what a fact is. So I ask that you stay vocal during your “retirement”. We need you as the voice of reason and morality, as you’ve always been.
Best wishes to you, Michelle, Malia, and Sasha. And once more (genuinely): Thanks, Obama!

Featured image: election night, 2012.