About this Author
College chemistry, 1983

Derek Lowe, the 2002 model

After 10 years of blogging. . .

Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship for his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis and other diseases. To contact Derek, email him directly or find him on Twitter: Dereklowe

Chemistry and Drug Data: Drugbank
Chempedia Lab
Synthetic Pages
Organic Chemistry Portal
Not Voodoo

Chemistry and Pharma Blogs:
Org Prep Daily
The Haystack
A New Merck, Reviewed
Liberal Arts Chemistry
Electron Pusher
All Things Metathesis
C&E News Blogs
Chemiotics II
Chemical Space
Noel O'Blog
In Vivo Blog
Terra Sigillata
BBSRC/Douglas Kell
Realizations in Biostatistics
ChemSpider Blog
Organic Chem - Education & Industry
Pharma Strategy Blog
No Name No Slogan
Practical Fragments
The Curious Wavefunction
Natural Product Man
Fragment Literature
Chemistry World Blog
Synthetic Nature
Chemistry Blog
Synthesizing Ideas
Eye on FDA
Chemical Forums
Symyx Blog
Sceptical Chymist
Lamentations on Chemistry
Computational Organic Chemistry
Mining Drugs
Henry Rzepa

Science Blogs and News:
Bad Science
The Loom
Uncertain Principles
Fierce Biotech
Blogs for Industry
Omics! Omics!
Young Female Scientist
Notional Slurry
Nobel Intent
SciTech Daily
Science Blog
Gene Expression (I)
Gene Expression (II)
Adventures in Ethics and Science
Transterrestrial Musings
Slashdot Science
Cosmic Variance
Biology News Net

Medical Blogs
DB's Medical Rants
Science-Based Medicine
Respectful Insolence
Diabetes Mine

Economics and Business
Marginal Revolution
The Volokh Conspiracy
Knowledge Problem

Politics / Current Events
Virginia Postrel
Belmont Club
Mickey Kaus

Belles Lettres
Uncouth Reflections
Arts and Letters Daily

In the Pipeline

April 17, 2014

Gitcher SF5 Groups Right Here

Posted by Derek

I think that several of us in medicinal chemistry have been keeping our eyes out for a chance to work in a pentafluorosulfanyl (SF5) group. I know I have - I actually have a good-sized folder on the things, and some of the intermediates as well, but I've never found the right opportunity. Yeah, I know, they're big and greasy, but since when has that ever stopped anyone in this business?

Well, here are some new routes to (pentafluorosulfanyl)difluoroacetic acid, a compound that had previously appeared in only a few scattered literature reports (and those involving nasty chemistry). So we all have even less of an excuse not to start polluting (er, enhancing) our screening collections with these things. Who's first?

Comments (8) + TrackBacks (0) | Category: Chemical News | Life in the Drug Labs

Changing A Broken Science System

Posted by Derek

Here's a suggestion for a total reform of the graduate student/postdoc system of scientific labor and training. It's from a distinguished list of authors, and appears in a high-profile journal, and it says without any equivocation that the system we have is in major trouble:

In the context of such progress, it is remarkable that even the most successful scientists and most promising trainees are increasingly pessimistic about the future of their chosen career. Based on extensive observations and discussions, we believe that these concerns are justified and that the biomedical research enterprise in the United States is on an unsustainable path. . .We believe that the root cause of the widespread malaise is a longstanding assumption that the biomedical research system in the United States will expand indefinitely at a substantial rate. We are now faced with the stark realization that this is not the case. Over the last decade, the expansion has stalled and even reversed.

They trace the problem back to the post-World War II funding boom (Vannevar Bush's "Endless Frontier"). I have to say, the paper gives the impression (no doubt for lack of space) that the progress of funding in the biomedical sciences was smoothly upwards until about 1990 or so, but as I understand it, the real kick was the post-Sputnik expansion. The 1960s were the real golden years for federal science and education spending, I think, as witness the profusion of buildings from that era to be found at many public universities. You can spot them from a hundred yards away, and boy, are there a lot of them. The authors lump that era in with the 1970s, but that latter decade, at least post-1973 or so, was hardly a period of a "vibrant US economy", as stated.

The doubling of the NIH's budget is also dealt with like a matador deals with a bull - a flick of the cape. But there's no doubt that the situation now isn't good:

However, eventually, beginning around 1990 and worsening after 2003, when a rapid doubling of the NIH budget ended, the demands for research dollars grew much faster than the supply. The demands were fueled in large part by incentives for institutional expansion, by the rapid growth of the scientific workforce, and by rising costs of research. Further slowdowns in federal funding, caused by the Great Recession of 2008 and by the budget sequestration that followed in 2013, have significantly exacerbated the problem. (Today, the resources available to the NIH are estimated to be at least 25% less in constant dollars than they were in 2003.)

The problem has been the same one faced by highway engineers: double the lanes on the highway, and new traffic fills it up again. Extra NIH money has been soaked up, and more, by an expansion in the customers for it. Even if their history is a bit off, the authors' analysis of the current situation seems to me to be right on target:

The mismatch between supply and demand can be partly laid at the feet of the discipline’s Malthusian traditions. The great majority of biomedical research is conducted by aspiring trainees: by graduate students and postdoctoral fellows. As a result, most successful biomedical scientists train far more scientists than are needed to replace him- or herself; in the aggregate, the training pipeline produces more scientists than relevant positions in academia, government, and the private sector are capable of absorbing.

The result, they say, has also been Malthusian: an increasingly nasty competition for resources, which is taking up more and more of everyone's time. It's creating selection pressure favoring the most ruthless elbow-throwers and body-slammers in the bunch, and at the same time making them scientifically timid, because the chances of getting something unusual funded are too low. (Paula Stephan's thoughts on all this are referenced, as well they should be). You may now see the birth of the "translational research" bandwagon:

One manifestation of this shift to short-term thinking is the inflated value that is now accorded to studies that claim a close link to medical practice. Human biology has always been a central part of the US biomedical effort. However, only recently has the term “translational research” been widely, if unofficially, used as a criterion for evaluation. Overvaluing translational research is detracting from an equivalent appreciation of fundamental research of broad applicability, without obvious connections to medicine.

I'm not quite so sure about the evocations of the golden age, when great scientists were happy to serve on grant review committees and there was plenty of time for scientific reflection and long-term thinking. I would place those further back in history than the authors seem to, if they existed at all. But there's no need to compare things today to some sort of ideal past - they're crappy on the absolute scale, prima facie.

From the early 1990s, every labor economist who has studied the pipeline for the biomedical workforce has proclaimed it to be broken. However, little has been done to reform the system, primarily because it continues to benefit more established and hence more influential scientists and because it has undoubtedly produced great science. Economists point out that many labor markets experience expansions and contractions, but biomedical science does not respond to classic market forces. As the demographer Michael Teitelbaum has observed, lower employment prospects for future scientists would normally be expected to lead to a decline in graduate school applicants, as well as to a contraction in the system.
In biomedical research, this does not happen, in part because of a large influx of foreign applicants for whom the prospects in the United States are more attractive than what they face in their own countries, but also because the opportunities for discovering new knowledge and improving human health are inherently so appealing.

Too many players have an incentive to act as if things are supposed to go on the way that they have - universities get overhead out of grant money, so why not hire as many grant-bringers as possible? And pay salaries, as much as possible, out of those grants instead of from university funds? Why not take in as many graduate students as the labs can hold? The Devil is (as usual) on hand to take the hindmost.

The rest of the paper is an outline of what might be done about all this. The authors propose that these steps be phased in over a multiyear period, with a goal of making funding more sensible (and predictable), and altering the way that the academic research workforce is recruited and handled. Here are the steps, in order:

1. Require longer-term budgeting for federal research funding.

2. Gradually reduce the number of PhD students in the biomedical sciences. Support them on training grants and fellowships rather than out of research grants. The rules barring the funding of non-US citizens through these routes need to be changed, because these should become the only routes.

3. Create more funding opportunities bridging science career paths and allied fields, so that there are more possible off-ramps for people with science training.

4. Gradually increase the salaries offered to federally funded postdocs, so the system doesn't overload with cheap labor. Limit the number of years that any postdoctoral fellow can be supported by federal research grants, and require salaries at the staff-scientist level if the person continues past that point.

5. Increase the proportion of staff scientists. Universities and granting institutions need to be given incentives to value these positions more.

6. Change at least some of the NIH granting mechanism to a system more like the Howard Hughes fellowships - that is, award longer-term money to outstanding people and labs, rather than to individual proposals. There should be several separate programs like this for different career stages.

7. Set aside a higher proportion of grants for "high-risk, high-reward" ideas.

8. At the same time, consider capping the total amount of money going to any one group, because of the diminishing-returns problem that seems to set in past a certain level.

9. Make grant evaluations less quantitative (number of publications, impact factors) and more qualitative. Novelty and long-term objectives should count more than technical details.

10. Broaden the reviewing groups (in age, geographical representation, and fields of expertise) to keep things from getting too inbred.

11. Start revising the whole "indirect cost recovery" system for grants, which has provided perverse incentives for institutions, with special attention to paying faculty salaries out of grant money.

The authors note that all these changes will tend to increase the unit cost of academic research and shrink research group sizes, but they regard these costs as worthwhile, because (1) the current system is artificially propped up in both regards, and (2) the changes should lead to higher-quality research overall. A lot of these ideas seem sound to me, but then, I've never had to deal with the academic research environment. There will, I'm sure, be many people who look on one or more of these proposals with dismay, for various reasons. It will be quite interesting to see if this gets any traction. . .

Comments (49) + TrackBacks (0) | Category: Academia (vs. Industry) | Graduate School

April 16, 2014

One and Done

Posted by Derek

Matthew Herper has a good piece in Forbes on Robert Duggan and Pharmacyclics. In the course of it, we learn this interesting (and perhaps disturbing) bit of information:

Second acts in the biotech business are hard: 56% of the drug firms that received an FDA approval between 1950 and 2011 did so only once.

And I hate to say it, but the article does not inspire confidence in Duggan's ability to break that trend. It's surely no coincidence that the profile mentions in its first paragraph that he's a major donor to the Church of Scientology, and maybe it's just my own prejudices, but when I hear that, I'm pretty much done with thinking that a person can make rational decisions.

Comments (23) + TrackBacks (0) | Category: Drug Industry History

The Latest Protein-Protein Compounds

Posted by Derek

Here's a review of protein-protein interaction "hot spots" and their application to drug discovery. There have been several overviews like this over the years. This one doesn't break much new ground, but it does provide a number of recent examples, all in one place.

People approach this subject because of its intrinsic interest (how proteins interact), and in hopes of finding small molecules that can interfere. The hot spot concept meshes well with the latter - if there's some key interaction, then you have a much better chance of messing with it via a drug-like molecule, compared to the one-wrinkled-surface-approaching-another-one mode of binding. There are probably no examples at either pure end of that continuum. Alanine scanning of a protein-protein interaction will always, I think, tell you that some residues are more important than others. But are they important enough that disrupting just that one would mess up the whole binding event? And (a bigger problem) is there any reason for a small molecule to be there in the first place? That's the real kicker, because while there are probably plenty of PPIs that wouldn't take place if you jammed a 350-MW small molecule into the middle of them, there aren't as many protein surfaces offering enough binding energy for the small molecule to want to do that.

And that word "small" probably needs to be in quotation marks. One excuse for the low hit rates in screening such things has been that existing compound libraries aren't stocked with the sorts of structures that are more likely to hit. I'm not sure how valid this argument is. It's the sort of statement that's very close to tautology: the reason we didn't find any good hits in the screen is because we don't have good hit compounds - thanks! But there may well be structural biases as you go towards protein-surface binders - big lunker molecules with lots of aryl rings, if this attempt to calculate their properties is valid. Now, I don't know about your screening libraries, but the ones I've worked with seem to have plenty of big flattish things in them already, so you still wonder a bit. But it does seem as if this area has a significantly greater chance of posing PK and formulation challenges, even if you do find something. The struggle continues.
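If you want to eyeball your own library for that "big lunker with lots of aryl rings" profile, a quick descriptor pass will do it. Here's an illustrative sketch using RDKit - to be clear, this is not the method from the property-calculation paper linked above, and the cutoffs are arbitrary placeholders I've picked for the example, not validated values:

```python
from rdkit import Chem
from rdkit.Chem import Descriptors, rdMolDescriptors

def looks_big_and_flat(smiles, mw_cut=300.0, ring_cut=3):
    """Crude flag for the 'big flattish' protein-surface-binder profile.
    Thresholds are arbitrary placeholders, not validated cutoffs."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        raise ValueError("bad SMILES: " + smiles)
    mw = Descriptors.MolWt(mol)                       # molecular weight
    aryl = rdMolDescriptors.CalcNumAromaticRings(mol) # aromatic ring count
    return mw >= mw_cut and aryl >= ring_cut

# p-quaterphenyl: four aryl rings, MW ~306 -- fits the profile
print(looks_big_and_flat("c1ccc(-c2ccc(-c3ccc(-c4ccccc4)cc3)cc2)cc1"))
# tetrahydrofuran: small and saturated -- does not
print(looks_big_and_flat("C1CCOC1"))
```

Running something like this over a screening deck at least tells you how much of the "big flattish" real estate you already own before you blame the library.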

Comments (4) + TrackBacks (0) | Category: Chemical News

Professor Fukuyama's Solvent Peaks

Posted by Derek

See Arr Oh expresses some doubts about all the NMR spectral corrections we've been seeing lately. He's specifically referring to Bethany Halford's interview piece, and he has this to say after reading it:

If your group focuses on "clean up your spectra" more than "purify your compounds better," that's a communications issue. If a professor with a large group sees nothing but perfect spectra all day, two thoughts should crop up:

1. "I must have the smartest, most efficient students in the world," or...
2. "Something's fishy here."

Perfect-looking data should always be a cause for concern in any experiment. My guess is that Prof. Fukuyama was closer to Option One, though, possibly in the variant of "My group has such high standards!" But high standards or not, a series of perfect, flat NMR spectra with no solvent and no impurities is rather hard to achieve in total synthesis, considering the quantities that are being used. Load up the tube with 50 mg of material and you can make a lot of stuff look good, but you don't have fifty milligrams at step thirty-four, do you? I remember putting everything I had into one NMR tube (or worse, one polarimeter tube) in my own total synthesis days, and I carried the thing down to the machine like it was a bag of gold.

But there's no doubt that in a big group, there will be people who try to slip things past the boss. I've seen it myself; I'm sure that a lot of you have. And if you're giving the boss exactly what the boss wants to see - perfection - then it's going to be a lot easier. These spectral problems look like a collaborative effort to me - expectations from above, willingness from below. And there are a lot of other groups that have done (and, I feel sure, still do) the same thing. Zapping the solvent peaks in the NMR is the least of it, in some cases.

Update: added a direct link to the Fukuyama/Yokoshima interview.

Comments (16) + TrackBacks (0) | Category: The Dark Side | The Scientific Literature

April 15, 2014

Novartis Gets Out of RNAi

Posted by Derek

Yesterday brought the sudden news that Novartis is pulling out of RNA interference research. The company is citing difficulties in development, and also the strategic point that not as many disease areas seem to be open to the use of the technique as they'd like. John LaMattina has more here - it's looking more and more like this may be a good field for smaller companies like Alnylam, but not something that's going to feed the beast at a company the size of Novartis (or Merck, who exited a while ago). If there's some sort of technology breakthrough, that could change - but you get the impression that Novartis was hoping for one before now.

Comments (11) + TrackBacks (0) | Category: Business and Markets

Total Synthesis in Flow

Posted by Derek

Steve Ley and co-workers have published what is surely the most ambitious flow-chemistry-based total synthesis ever attempted. Natural products spirodienal A and spirangien A methyl ester are prepared with almost every step (and purification) being done in flow mode.

The scheme shown (for one of the intermediates) will give you the idea. There are some batch-mode portage steps, such as 15 to 16, mainly because of extended reaction times that weren't adaptable to flow conditions. But the ones that could be adapted were, and it seems to have helped out with the supply of intermediates (which is always a tedious job in total synthesis, because you're either bored, when things are working like they always do, or pissed off, because something's gone wrong). Aldehyde 11 could be produced from 10 at a rate of 12 mmol/hour, for example.

The later steps of the synthesis tend much more towards batch mode, as you might imagine, since they're pickier (and not run as many times, either, I'll bet, compared to the number of times the earlier sequences were). Flow is perfect for those "Make me a pile of this stuff" situations. Overall, this is impressive work, and demonstrates still more chemistry that can be adapted usefully to flow conditions. Given my attitude towards total synthesis, I don't care much about spirodienal A, but I certainly do care about new ways to make new compounds more easily, and that's what this paper is really aiming for.

Comments (11) + TrackBacks (0) | Category: Chemical News

Sweet Reason Lands On Its Face

Posted by Derek

This study has implications for many fields of science where its practitioners keep running into rumor and conspiracy theories. The authors tried several different means to increase the uptake of the MMR vaccine (information about the lack of connection with autism, information about the severity of the diseases being prevented, case histories of children who'd had them, and so on), and compared them to see if anything helped with parents who were skeptical of having their children vaccinated.

You can probably guess: none of these helped at all. In fact, several of the interventions appeared to make things even worse, reinforcing beliefs in the dangers of vaccination. There's a general principle at work here, which I've heard stated as "You can't use reason to talk someone out of a position that they didn't arrive at by reason". It's the wrong tool for the job, like using a screwdriver to pull nails. I'd also note that people who are suspicious of vaccines are also likely to be alert to signs that someone is trying to convince them otherwise, and will react accordingly. They know that their position is a minority one - that's part of the attraction, in many cases.

"Here, read this pamphlet from the CDC" is a strategy with no hope whatsoever of working. The case-history approach was probably a better idea, but just the fact that it's coming from some official medical source is enough, in these cases, to discredit it completely. That's what they want you to think. In the context of this blog, I run into this sort of thinking most often in the form of "Big Pharma doesn't want to cure anything", or even "Big Pharma knows how to cure cancer, but doesn't want to tell anyone because it would hurt their profits". The only way I've ever made any headway with that one (and it hasn't been very often) is when I've had a chance to go one-on-one with a believer. Looking someone in the eye and asking them if they really are accusing me of watching some of my family members die from diabetes, cancer, and heart disease while I was hiding the cures and collecting my paycheck is an uncomfortable conversation, but I've had it a few times. The only counterattack has been that no, they're not saying that I personally have these things in my desk drawer, it's the higher-ups, you know, them. "So how have I been working on these diseases for 25 years without rediscovering any of these cures?" I ask, and that generally winds things up. But I like to think (or to kid myself) that I've planted a slight seed of doubt.

You need as much conviction in your voice as the quacks have, though, and that's not easy, because they have a lot. Science has the evidence on its side, naturally, and that's a lot, but conspiracy theorists and their friends have something to believe in, and that's a very strong part of human nature indeed. It is not satisfied by contemplating charts or tables; it does not find fulfillment in double-blinded trials. It provides a ward against fear, the comfort of knowing secrets that others don't, and a fellowship of like-minded believers. In many cases, when you're trying to persuade someone out of these views, you're not just trying to argue a specific point - you're trying to talk them out of an entire worldview. CDC pamphlets don't stand a chance.

Comments (36) + TrackBacks (0) | Category: Snake Oil

April 14, 2014

More on the Science Chemogenomic Signatures Paper

Posted by Derek

This will be a long one. I'm going to take another look at the Science paper that stirred up so much comment here on Friday. In that post, my first objection (but certainly not my only one) was the chemical structures shown in the paper's Figure 2. A number of them are basically impossible, and I just could not imagine how this got through any sort of refereeing process. There is, for example, a cyclohexadien-one structure, shown at left, and that one just doesn't exist as such - it's phenol, and those equilibrium arrows, though very imbalanced, are still not drawn to scale.
Well, that problem is solved by those structures being intended as fragments, substructures of other molecules. But I'm still positive that no organic chemist was involved in putting that figure together, or in reviewing it, because the reason that I was confused (and many other chemists were as well) is that no one who knows organic chemistry draws substructures like this. What you want to do is put dashed bonds in there, or R groups, as shown. That does two things: it shows that you're talking about a whole class of compounds, not just the structure shown, and it also shows where things are substituted. Now, on that cyclohexadienone, there's not much doubt where it's substituted, once you realize that someone actually intended it to be a fragment. It can't exist unless that carbon is tied up, either with two R groups (as shown), or with an exo-alkene, in which case you have a class of compounds called quinone methides. We'll return to those in a bit, but first, another word about substructures and R groups.
Figure 2 also has many structures in it where the fragment structure, as drawn, is a perfectly reasonable molecule (unlike the example above). Tetrahydrofuran and imidazole appear, and there's certainly nothing wrong with either of those. But if you're going to refer to those as common fragments, leading to common effects, you have to specify where they're substituted, because that can make a world of difference. If you still want to say that they can be substituted at different points, then you can draw a THF, for example, with a "floating" R group as shown at left. That's OK, and anyone who knows organic chemistry will understand what you mean by it. If you just draw THF, though, then an organic chemist will understand that to mean just plain old THF, and thus the misunderstanding.

If the problems with this paper ended at the level of structure drawing, which many people will no doubt see as just a minor aesthetic point, then I'd be apologizing right now. But they don't. (Update: although the drawing is irritating on its own - on Twitter, I just saw that someone spotted "dihydrophyranone" in this figure, which someone apparently figured was close enough to "dihydropyranone", and anyway, it's just chemistry.) It struck me when I first saw this work that sloppiness in organic chemistry might be symptomatic of deeper trouble, and I think that's the case. The problems just keep on coming. Let's start with those THF and imidazole rings. They're in Figure 2 because they're supposed to be substructures that lead to some consistent pathway activity in the paper's huge (and impressive) yeast screening effort. But what we're talking about is a pharmacophore, to use a term from medicinal chemistry, and just "imidazole" by itself is too small a structure, from a library of 3200 compounds, to be a likely pharmacophore. Particularly when you're not even specifying where it's substituted and how. There are all kinds of imidazoles out there, and they do all kinds of things.
So just how many imidazoles are in the library, and how many caused this particular signature? I think I've found them all. Shown at left are the four imidazoles (and there are only four) that exhibit the activity shown in Figure 2 (ergosterol depletion / effects on membrane). Note that all four of them are known antifungals - which makes sense, given that the compounds were chosen for their ability to inhibit the growth of yeast, and topical antifungals will indeed do that for you. And that phenotype is exactly what you'd expect from miconazole, et al., because that's their known mechanism of action: they mess up the synthesis of ergosterol, which is an essential part of the fungal cell membrane. It would be quite worrisome if these compounds didn't show up under that heading. (Note that miconazole is on the list twice).
But note that there are nine other imidazoles that don't have that same response signature at all - and I didn't even count the benzimidazoles, and there are many, although from that structure in Figure 2, who's to say that they shouldn't be included? What I'm saying here is that imidazole by itself is not enough. A majority of the imidazoles in this screen actually don't get binned this way. You shouldn't look at a compound's structure, see that it has an imidazole, and then decide by looking at Figure 2 that it's therefore probably going to deplete ergosterol and lead to membrane effects. (Keep in mind that those membrane effects probably aren't going to show up in mammalian cells, anyway, since we don't use ergosterol that way).
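The nonselectivity of a bare-ring "substructure" is easy to demonstrate. Here's a hypothetical sketch in RDKit (my own example, not anything from the paper - the SMILES and the SMARTS pattern are mine): a bare aromatic 1,3-diazole pattern, drawn Figure-2 style with no substitution specified, matches an N-substituted azole-antifungal-style imidazole, plain histamine, and even a benzimidazole, while telling you nothing about which of those a cell would care about.

```python
from rdkit import Chem

# Bare 1,3-diazole (imidazole) ring, with no substitution specified --
# the Figure-2 style of "substructure"
imidazole_ring = Chem.MolFromSmarts("c1nccn1")

probes = {
    "1-methylimidazole": "Cn1ccnc1",         # N-substituted, azole-antifungal style
    "histamine":         "NCCc1c[nH]cn1",    # plain biogenic amine
    "benzimidazole":     "c1ccc2[nH]cnc2c1", # fused ring -- also matches
    "pyrazole":          "c1cc[nH]n1",       # 1,2-diazole -- does not match
}

hits = {name: Chem.MolFromSmiles(smi).HasSubstructMatch(imidazole_ring)
        for name, smi in probes.items()}
print(hits)
```

An R-group-annotated pattern (say, one requiring N-substitution) would split these cases apart, which is exactly the distinction the figure fails to draw.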

There are other imidazole-containing antifungals on the list that are not marked down for "ergosterol depletion / effects on membrane". Ketoconazole is SGTC_217 and 1066, and one of those runs gets this designation, while the other one gets signature 118. Both bifonazole and sertaconazole also inhibit the production of ergosterol - although, to be fair, bifonazole does it by a different mechanism. It gets annotated as Response Signature 19, one of the minor ones, while sertaconazole gets marked down for "plasma membrane distress". That's OK, though, because it's known to have a direct effect on fungal membranes separate from its ergosterol-depleting one, so it's believable that it ends up in a different category. But there are plenty of other antifungals on this list, some containing imidazoles and some containing triazoles, whose mechanism of action is also known to be ergosterol depletion. Fluconazole, for example, is SGTC_227, 1787 and 1788, and that's how it works. But its signature is listed as "Iron homeostasis" once and "azole and statin" twice. Itraconazole is SGTC_1076, and it's also annotated as Response Signature 19. Voriconazole is SGTC_1084, and it's down as "azole and statin". Climbazole is SGTC_2777, and it's marked as "iron homeostasis" as well. This scattering of known drugs between different categories is possibly an indicator of this screen's ability to differentiate them, or possibly an indicator of its inherent limitations.

Now we get to another big problem, the imidazolium at the bottom of Figure 2. It is, as I said on Friday, completely nuts to assign a protonated imidazole to a different category than a nonprotonated one. Note that several of the imidazole-containing compounds mentioned above are already protonated salts - they, in fact, fit the imidazolium structure drawn, rather than the imidazole one that they're assigned to. This mistake alone makes Figure 2 very problematic indeed. If the paper was, in fact, talking about protonated imidazoles (which, again, is what the authors have drawn) it would be enough to immediately call into question the whole thing, because a protonated imidazole is the same as a regular imidazole when you put it into a buffered system. In fact, if you go through the list, you find that what they're actually talking about are N-alkylimidazoliums, so the structure at the bottom of Figure 2 is wrong, and misleading. There are two compounds on the list with this signature, in case you were wondering, but the annotation may well be accurate, because some long-chain alkylimidazolium compounds (such as ionic liquid components) are already known to cause mitochondrial depolarization.

But there are several other alkylimidazolium compounds in the set (which is a bit odd, since they're not exactly drug-like). And they're not assigned to the mitochondrial distress phenotype, as Figure 2 would have you think. SGTC_1247, 179, 193, 1991, 327, and 547 all have this moiety, and they scatter between several other categories. Once again, a majority of compounds with the Figure 2 substructure don't actually map to the phenotype shown (while plenty of other structural types do). What use, exactly, is Figure 2 supposed to be?

Let's turn to some other structures in it. The impossible/implausible ones, as mentioned above, turn out to be that way because they're supposed to have substituents on them. But look around - adamantane is on there. To put it as kindly as possible, adamantane itself is not much of a pharmacophore, having nothing going for it but an odd size and shape for grease. Tetrahydrofuran (THF) is on there, too, and similar objections apply. When attempts have been made to rank the sorts of functional groups that are likely to interact with protein binding sites, ethers always come out poorly. THF by itself is not some sort of key structural unit; highlighting it as one here is, for a medicinal chemist, distinctly weird.

What's also weird is that when I search for THF-containing compounds that show this activity signature, I can't find much. The only things with a THF ring in them seem to be SGTC_2563 (the complex natural product tomatine) and SGTC_3239, and neither one of them is marked with the signature shown. There are some embedded THF rings in the other structural fragments shown (the succinimide-derived Diels-Alder ones), but no other THFs - and as mentioned, it's truly unlikely that the ether is the key thing about these compounds, anyway. If anyone finds another THF compound annotated for tubulin folding, I'll correct this post immediately, but for now, I can't seem to track one down, even though Table S4 says that there are 65 of them. Again, what exactly is Figure 2 supposed to be telling anyone?

Now we come to some even larger concerns. The supplementary material for the paper says that 95% of the compounds on the list are "drug-like" and were filtered by the commercial suppliers to eliminate reactive compounds. They do caution that different people have different cutoffs for this sort of thing, and boy, do they ever. There are many, many compounds in this collection that I would not have bothered putting into a cell assay, for fear of hitting too many things and generating uninterpretable data. Quinone methides are a good example - as mentioned before, they're in this set. Rhodanines and similar scaffolds are well represented, and are well known to hit all over the place. Some of these things are tested at hundreds of micromolar.

I recognize that one aim of a study like this is to stress the cells by any means necessary and see what happens, but even with that in mind, I think fewer nasty compounds could have been used, and might have given cleaner data. The curves seen in the supplementary data are often, well, ugly. See the comments section from the Friday post on that, but I would be wary of interpreting many of them myself.
There's another problem with these compounds, which might very well have also led to the nastiness of the assay curves. As mentioned on Friday, how can anyone expect many of these compounds to actually be soluble at the levels shown? I've shown a selection of them here; I could go on. I just don't see any way that these compounds can be realistically assayed at these levels. Visual inspection of the wells would surely show cloudy gunk all over the place. Again, how are such assays to be interpreted?

And one final point, although it's a big one. Compound purity. Anyone who's ever ordered three thousand compounds from commercial and public collections will know, will be absolutely certain that they will not all be what they say on the label. There will be many colors and consistencies, and LC/MS checks will show many peaks for some of these. There's no way around it; that's how it is when you buy compounds. I can find no evidence in the paper or its supplementary files that any compound purity assays were undertaken at any point. This is not just bad procedure; this is something that would have caused me to reject the paper all by itself had I refereed it. This is yet another sign that no one who's used to dealing with medicinal chemistry worked on this project. No one with any experience would just bung in three thousand compounds like this and report the results as if they're all real. The hits in an assay like this, by the way, are likely to be enriched in crap, making this more of an issue than ever.

Damn it, I hate to be so hard on so many people who did so much work. But wasn't there a chemist anywhere in the room at any point?

Comments (38) + TrackBacks (0) | Category: Biological News | Chemical Biology | Chemical News | The Scientific Literature

April 11, 2014

Biology Maybe Right, Chemistry Ridiculously Wrong

Email This Entry

Posted by Derek

Note: critique of this paper continues here, in another post.

A reader sent along a puzzled note about this paper that's out in Science. It's from a large multicenter team (at least nine departments across the US, Canada, and Europe), and it's an ambitious effort to profile 3250 small molecules in a broad chemogenomics screen in yeast. This set was selected from an earlier 50,000 compounds, since these reliably inhibited the growth of wild-type yeast. They're looking for what they call "chemogenomic fitness signatures", which are derived from screening first against 1100 heterozygous yeast strains, one gene deletion per, representing the yeast essential genome. Then there's a second round of screening against 4800 homozygous deletion strains of non-essential genes, to look for related pathways, compensation, and so on.

All in all, they identified 317 compounds that appear to perturb 121 genes, and many of these annotations are new. Overall, the responses tended to cluster in related groups, and the paper goes into detail about these signatures (and about the outliers, which are naturally interesting for their own reasons). Broad pathway effects like mitochondrial stress show up pretty clearly, for example. And unfortunately, that's all I'm going to say for now about the biology, because we need to talk about the chemistry in this paper. It isn't good.

As my correspondent (a chemist himself) mentions, a close look at Figure 2 of the paper raises some real questions. Take a look at that cyclohexadiene enamine - can that really be drawn correctly, or isn't it just N-phenylbenzylamine? The problem is, that compound (drawn correctly) shows up elsewhere in Figure 2, hitting a completely different pathway. These two tautomers are not going to have different biological effects, partly because the first one would exist for about two molecular vibrations before it converted to the second. But how could both of them appear on the same figure?

And look at what they're calling "cyclohexa-2,4-dien-1-one". No such compound exists as such in the real world - we call it phenol, and we draw it as an aromatic ring with an OH coming from it. Thiazolidinedione is listed as "thiazolidine-2,4-quinone". Both of these would lead to red "X" marks on an undergraduate exam paper. It is clear that no chemist, not even someone who's been through second-year organic class, was involved in this work (or at the very least, involved in the preparation of Figure 2). Why not? Who reviewed this, anyway?

There are some unusual features from a med-chem standpoint as well. Is THF really targeting tubulin folding? Does adamantane really target ubiquinone biosynthesis? Fine, these are the cellular effects that they noted, I guess. But the weirdest thing on Figure 2's annotations is that imidazole is shown as having one profile, while protonated imidazole is shown as a completely different one. How is this possible? How could anyone who knows any chemistry look at that and not raise an eyebrow? Isn't this assay run in some sort of buffered medium? Don't yeast cells have any buffering capacity of their own? Salts of basic amine drugs are dosed all the time, and they are not considered - ever - as having totally different cellular effects. What a world it would be if that were true! Seeing this sort of thing makes a person wonder about the rest of the paper.
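The buffering point can be made quantitative with the Henderson-Hasselbalch equation. Here's a minimal Python sketch, assuming the textbook pKa of roughly 7.0 for imidazole's conjugate acid (the exact value shifts a little with conditions): whether you dose the free base or its salt, the same protonation equilibrium results once the medium sets the pH.

```python
def fraction_protonated(pka: float, ph: float) -> float:
    """Henderson-Hasselbalch: fraction of a base present as its
    conjugate acid (the protonated form) at a given pH."""
    return 1.0 / (1.0 + 10.0 ** (ph - pka))

# Approximate pKa of imidazole's conjugate acid (assumed textbook value).
PKA_IMIDAZOLIUM = 7.0

# The equilibrium mixture depends only on the pH the medium enforces,
# not on whether the compound went in as free base or as a salt.
for ph in (5.0, 7.0, 7.4):
    f = fraction_protonated(PKA_IMIDAZOLIUM, ph)
    print(f"pH {ph}: {100 * f:.1f}% protonated")
```

At physiological-ish pH the two "different" Figure 2 structures are simply the major and minor species of one equilibrium.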

More subtle problems emerge when you go to the supplementary material and take a look at the list of compounds. It's a pretty mixed bag. The concentrations used for the assays vary widely - rapamycin gets run at 1 micromolar, while ketoconazole is nearly 1 millimolar. (Can you even run that compound at that concentration? Or that nitroaromatic compound at 967 micromolar? Is it really soluble in the yeast wells at such levels?) There are plenty more that you can wonder about in the same way.

And I went searching for my old friends, the rhodanines, and there they were. Unfortunately, compound SGTC_2454 is 5-benzylidenerhodanine, whose activity is listed as "A dopamine receptor inhibitor" (!). But compound SGTC_1883 is also 5-benzylidenerhodanine, the same compound, run at a similar concentration, but this time unannotated. The 5-thienylidenerhodanine is SGTC_30, but that one's listed as a phosphatase inhibitor. Neither of these attributions seems likely to me. There are other duplicates, but many of them are no doubt intentional (run by different parts of the team).

I hate to say this, but just a morning's look at this paper leaves me with little doubt that there are still more strange things buried in the chemistry side of this paper. But since I work for a living (dang it), I'm going to leave it right here, because what I've already noted is more than troubling enough. These mistakes are serious, and call the conclusions of the paper into question: if you can annotate imidazole and its protonated form into two different categories, or annotate two different tautomers (one of which doesn't really exist) into two different categories, what else is wrong, and how much are these annotations worth? And this isn't even the first time that Science has let something like this through. Back in 2010, they published a paper on the "Reactome" that had chemists around the world groaning. How many times does this lesson need to be learned, anyway?

Update: this situation brings up a number of larger issues, such as the divide between chemists and biologists (especially in academia?) and the place of organic chemistry in such high-profile publications (and the place of organic chemists as reviewers of it). I'll defer these to another post, but believe me, they're on my mind.

Update 2: Jake Yeston, deputy editor at Science, tells me that they're looking into this situation. More as I hear it.

Update 3: OK, if Figure 2 is just fragments, structural pieces that were common to compounds that had these signatures, then (1) these are still not acceptable structures, even as fragments, and (2), many of these don't make sense from a medicinal chemistry standpoint. It's bizarre to claim a tetrahydrofuran ring (for example) as the key driver for a class of compounds; the chance that this group is making an actual, persistent interaction with some protein site (or family of sites) is remote indeed. The imidazole/protonated imidazole pair is a good example of this: why on Earth would you pick these two groups to illustrate some chemical tendency? Again, this looks like the work of people who don't really have much chemical knowledge.

A closer look at the compounds themselves does not inspire any more confidence. Take compound 0560-0053 from Table S3, which showed a very large difference in IC50 across different yeast strains. It was tested at 400 micromolar. That, folks, was sold to the authors of this paper by ChemDiv, as part of a "drug-like compound" library. Try pulling some SMILES strings from that table yourself and see what you think about their drug-likeness.

Comments (129) + TrackBacks (0) | Category: Chemical Biology | Chemical News | The Scientific Literature

April 10, 2014

Encoded Libraries Versus a Protein-Protein Interaction

Email This Entry

Posted by Derek

So here's the GSK paper on applying the DNA-encoded library technology to a protein-protein target. I'm particularly interested in seeing the more exotic techniques applied to hard targets like these, because it looks like there are plenty of them where we're going to need all the help we can get. In this case, they're going after integrin LFA-1. That's a key signaling molecule in leukocyte migration during inflammation, and there was an antibody (Raptiva, efalizumab) on the market, until it was withdrawn for too many side effects. (It dialed down the immune system rather too well). But can you replace an antibody with a small molecule?

A lot of people have tried. This is a pretty well-precedented protein-protein interaction for drug discovery, although (as this paper mentions), most of the screens have been direct PPI ones, and most of the compounds found have been allosteric - they fit into another spot on LFA-1 and disrupt the equilibrium between a low-affinity form and the high-affinity one. In this case, though, the GSK folks used their encoded libraries to screen directly against the LFA-1 protein. As usual, the theoretical number of compounds in the collection was bizarre, about 4 billion compounds (it's the substituted triazine library that they've described before).
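That "4 billion" figure is just split-and-pool combinatorics: an encoded library's theoretical size is the product of the building-block counts at each diversity position. The counts below are hypothetical, chosen only to show how a three-cycle triazine library reaches that range; the actual GSK cycle sizes aren't given here.

```python
# Hypothetical building-block counts for three diversity positions on
# a triazine scaffold (illustration only, not GSK's actual numbers).
building_blocks_per_cycle = [1600, 1600, 1600]

# Theoretical library size = product of the cycle sizes.
size = 1
for n in building_blocks_per_cycle:
    size *= n

print(f"Theoretical library size: {size:,}")  # about 4.1 billion members
```

Three cycles of a few thousand reagents each is all it takes, which is why these theoretical numbers sound so outlandish compared to conventional screening decks.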

An indanyl amino acid in one position on the triazine seemed to be a key SAR point in the resulting screen, and there were at least four other substituents at the next triazine point that kept up its activity. Synthesizing these off the DNA tags gave double-digit nanomolar affinities (if they hadn't, we wouldn't be hearing about this work, I'm pretty sure). Developing the SAR from these seems to have gone in classic med-chem fashion, although a lot of classic med-chem programs would very much like to be able to start off with some 50 nM compounds. The compounds were also potent in cell adhesion assays, with an interesting twist - the team also used a mutated form of LFA-1 where a disulfide holds it fixed in the high-affinity state. The known small-molecule allosteric inhibitors work against wild-type in this cell assay, but wipe out against the locked mutant, as they should. These triazines showed the same behavior; they also target the allosteric site.

That probably shouldn't have come as a surprise. Most protein-protein interactions have limited opportunities for small molecules to affect them, and if there's a known friendly spot like the allosteric site here, you'd have to expect that most of your hits are going to be landing on it. You wonder what might happen if you ran the ELT screen against the high-affinity-locked mutant protein - if it's good enough to work in cells, it should be good enough to serve in a screen for non-allosteric compounds. The answer (most likely) is that you sure wouldn't find any 50 nM leads - I wonder what you'd find at all? Running four billion compounds across a protein surface and finding no real hits would be a sobering experience.

The paper finishes up by showing the synthesis of some fluorescently tagged derivatives, and showing that these also work in the cell assays. The last sentence is: "The latter phenomena provided an opportunity for ELT selections against a desired target in its natural state on cell surface. We are currently exploring this technology development opportunity." I wonder if they are? For the same reasons given above, you'd expect to find mostly allosteric binders, and those already seem to be findable. And it's my impression that this is the early-stage ELT stuff (the triazine library); plus, when you look at the list of authors, there are several "Present address" footnotes. So this work was presumably done a while back and is just now coming into the light.

So the question of using this technique against PPI targets remains open, as far as I can tell. This one had already been shown to yield small-molecule hits, and it did so again, in the same binding pocket. What happens when you set out into the unknown? Presumably, GlaxoSmithKline (and the other groups pursuing encoded libraries) know a lot more about this than the rest of us do. Surely some screens like this have been run. Either they came up empty - in which case we'll never hear about them - or they actually yielded something interesting, in which case we'll hear about them over the next few years. If you want to know the answer before then, you're going to have to run some yourself. Isn't that always the way?

Comments (17) + TrackBacks (0) | Category: Chemical Biology | Drug Assays

April 9, 2014

The State of Alzheimer's Research, 2014

Email This Entry

Posted by Derek

Via Bernard Munos on Twitter, here's a report from the New York Academy of Sciences looking at the current state of Alzheimer's research. Those various tabs are all live; you can get summaries of each one by clicking.

Looking them over breeds a mixture of hope and despair. The whole thing is themed around the 2025 target that many in the Alzheimer's world have been talking about. And while I understand the need for goals, etc., that year seems way too close. If a promising new compound were to be discovered this afternoon, it wouldn't make it. That brings up another point - many of the speakers at this meeting were talking about moving away from a "compound-centric" point of view. I can see (some of) the point, because there may well be other things to do for Alzheimer's patients. But it's also worth remembering that the reason people are talking like this is that no compounds have worked. This outlook is a second choice driven by necessity, not by some sort of obvious first principle.

And I think that, in the end, Alzheimer's will be arrested by compounds - more than one, most likely, and some of them are quite possibly going to be biomolecules, but compounds all the same. Reading the recommendations about adaptive clinical trials (good idea), broader cooperation and use of common clinical standards (another good idea), and all the others just make me wonder: clinical trials of what? That's the real stumper in this field; where to go next. How to go there is a topic that it's easier to reach agreement on.

Comments (41) + TrackBacks (0) | Category: Alzheimer's Disease

AstraZeneca's Cambridge Move

Email This Entry

Posted by Derek

Here's more on AstraZeneca's move to Cambridge (UK). They've set up an agreement with the Medical Research Council to have MRC people working "alongside" AZ people, although details seem pretty short on how that's going to happen in practice. Here's some of it, though:

Within the AstraZeneca MRC UK Centre for Lead Discovery, the academics will get access to more than 2 million compounds in AstraZeneca's library and have the use of high-tech screening equipment to study diseases and possible treatments.

Their research proposals will be assessed by the MRC, which will fund up to 15 projects a year and AstraZeneca will have the first option to license any resulting drug discovery programs.

I liked this part of the article as well:

Other large drugmakers have built research outposts in life science centers like Cambridge, Boston and San Francisco - but none have undertaken such a wholesale move of operations.

The strategy is not without risks, especially if the upheaval disrupts current research projects or results in key staff leaving the company. A smooth transition is seen as a key test for CEO Soriot as he tries to change the culture at AstraZeneca to put science at the center of its activities.

What was at the center of AZ's operations before?

Comments (28) + TrackBacks (0) | Category: Business and Markets

April 8, 2014

A Call For Better Mouse Studies

Email This Entry

Posted by Derek

Here's an article by Steve Perrin, at the ALS Therapy Development Institute, and you can tell that he's a pretty frustrated guy. With good reason.
That chart shows why. Those are attempted replicates of putative ALS drugs, and you can see that there's a bit of a discrepancy here and there. One problem is poorly run mouse studies, and the TDI has been trying to do something about that:

After nearly a decade of validation work, the ALS TDI introduced guidelines that should reduce the number of false positives in preclinical studies and so prevent unwarranted clinical trials. The recommendations, which pertain to other diseases too, include: rigorously assessing animals' physical and biochemical traits in terms of human disease; characterizing when disease symptoms and death occur and being alert to unexpected variation; and creating a mathematical model to aid experimental design, including how many mice must be included in a study. It is astonishing how often such straightforward steps are overlooked. It is hard to find a publication, for example, in which a preclinical animal study is backed by statistical models to minimize experimental noise.

All true, and we'd be a lot better off if such recommendations were followed more often. Crappy animal data is far worse than no animal data at all. But the other part of the problem is that the mouse models of ALS aren't very good:

. . .Mouse models expressing a mutant form of the RNA binding protein TDP43 show hallmark features of ALS: loss of motor neurons, protein aggregation and progressive muscle atrophy.

But further study of these mice revealed key differences. In patients (and in established mouse models), paralysis progresses over time. However, we did not observe this progression in TDP43-mutant mice. Measurements of gait and grip strength showed that their muscle deficits were in fact mild, and post-mortem examination found that the animals died not of progressive muscle atrophy, but of acute bowel obstruction caused by deterioration of smooth muscles in the gut. Although the existing TDP43-mutant mice may be useful for studying drugs' effects on certain disease mechanisms, a drug's ability to extend survival would most probably be irrelevant to people.

A big problem is that the recent emphasis on translational research in academia is going to land many labs right into these problems. As the rest of that Nature article shows, the ways for a mouse study to go wrong are many, various, and subtle. If you don't pay very close attention, and have people who know what to pay attention to, you could be wasting time, money, and animals to generate data that will go on to waste still more of all three. I'd strongly urge anyone doing rodent studies, and especially labs that haven't done or commissioned very many of them before, to read up on these issues in detail. It slows things down, true, and it costs money. But there are worse things.
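The "how many mice" recommendation above doesn't require anything exotic; a standard two-group power calculation done before the study is often enough. Here's a minimal Python sketch using the normal approximation, with hypothetical effect-size and variability numbers; the z-values are the usual quantiles for two-sided alpha = 0.05 and 80% power.

```python
import math

Z_ALPHA = 1.96  # two-sided alpha = 0.05
Z_BETA = 0.84   # power = 0.80

def mice_per_group(effect_size: float, sd: float) -> int:
    """Per-group n to detect a difference in means:
    n = 2 * ((z_alpha + z_beta) * sd / delta)^2, rounded up."""
    n = 2 * ((Z_ALPHA + Z_BETA) * sd / effect_size) ** 2
    return math.ceil(n)

# Hypothetical numbers: hoping to detect a 10-day survival difference
# when the between-animal standard deviation is 12 days.
print(mice_per_group(effect_size=10.0, sd=12.0))  # prints 23
```

Note how quickly the required n grows as the expected effect shrinks; halving the detectable difference roughly quadruples the group size, which is exactly why underpowered six-mouse studies generate so many false positives.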

Comments (19) + TrackBacks (0) | Category: Animal Testing | The Central Nervous System

Biotech Boom, Biotech Bust?

Email This Entry

Posted by Derek

Here's a good one by Matthew Herper on "Three Misplaced Assumptions That Could End the Biotech Boom". Given the way the biotech stock index has been performing lately, with a horrendous March and April that's taken it into negative territory for the year to date, something definitely seems to be causing a change of mind.

I'll let you see what Herper's three assumptions are, but I can tell you already that they sound valid to me. I think that his points are particularly relevant to investors who may have been jumping on the stocks in the area without having a clear idea of what the industry is really like. As he says, ". . .investors should avoid thinking that the drug business has undergone a fundamental change in the past few years. It hasn’t." It's the same fun-filled thrill ride as ever!

Comments (2) + TrackBacks (0) | Category: Business and Markets

Can You Patent A Natural Product? Prepare For a Different Answer

Email This Entry

Posted by Derek

So, can you patent naturally occurring substances, or not? That's a rather complicated question, and some recent Supreme Court decisions have recomplicated it in US patent law: Mayo v. Prometheus and Assoc. Mol. Pathology v. Myriad Genetics. The latter, especially, has sent the PTO (and the IP lawyers) back to staring out their respective windows, thinking about what to do next.

The Patent Office has now issued new guidelines for its examiners in light of these rulings, though, and things may be changing. Previous standards for patenting naturally occurring compounds have been tightened up - if I'm reading this correctly, no longer is the process of isolation and purification itself seen as enough of a modification to make a case for patentability. The four "judicial exception" categories, to be used in patentability decisions, are (1) abstract ideas, (2) laws of nature, (3) natural phenomena, and (4) natural products. And examiners are specifically asked to determine if a patent application's claims recite something "significantly different" than these.

Here's the blog of an IP firm that thinks that the USPTO has gone too far:

Now we learn that grant of these and similar patents were mistakes, that 100 years of consistent practice in the field of patents was wrong, that what was invented was no more than products of nature without significant structural difference from the naturally-occurring materials, and that the USPTO will endeavour to avoid such mistakes in future. . .

. . .Whatever workable rule of law is derivable from Prometheus, it is apparent from the opinion of Justice Breyer that it was not the Court’s intention to bring about a radical change in pharmaceutical practice. The opinion gives a warning against undue breadth:

“The Court has recognized, however, that too broad an interpretation of this exclusionary principle could eviscerate patent law. For all inventions at some level embody, use, reflect, rest upon, or apply laws of nature, natural phenomena, or abstract ideas.”

The problem (and it's the usual problem with fresh patent law) is that we really don't know what the phrases in the decisions or guidance mean, in practice, until there's been some practice. This is going to be thrashed out application by application, lawsuit by lawsuit, until some new equilibrium is reached. Right now, though, if you're trying to patent something that could be considered an isolated natural product, your life has become much more complicated and uncertain. Here's another IP law firm:

What is the "significantly different" standard? With respect to natural products, the Guidance offers that what is claimed should be "non-naturally occurring and markedly different in structure from the naturally occurring products". Again, it is unclear at this point how different "markedly different" will be. How different it needs to be will be worked out on a case-by-case basis, beginning at the level of the patent examiner at the USPTO.

So how can you protect your IP if it involves subject matter that could be considered a "product of nature" by a US examiner? Since we don't yet really know how different "markedly different" is, one prudent strategy would be to include multiple claims having varying degrees of modifications relative to the naturally occurring thing, to the extent these makes sense commercially and scientifically. The more different your claimed product is from the naturally occurring thing, the more likely it is to be considered patent eligible by the USPTO.

Comments (19) + TrackBacks (0) | Category: Patents and IP

April 7, 2014

Is Palbociclib Promising? Or Not?

Email This Entry

Posted by Derek

Here's a good test for whatever news outlets you might be using for biotech information. How are they handling Pfizer's release of palbociclib information from the AACR meeting over the weekend?

Do a news search for the drug's name, and you'll see headline after headline. Many of them include the phrase "Promising Results". And from one standpoint, those words are justified. The drug showed a near-doubling in progression-free survival (PFS) when added to the standard of care, and you'd think that that has to be good. But a first analysis of overall survival (OS) shows no statistically significant improvement.

Now, how can that be? One possibility is that the drug helps hold advanced breast cancer back, until a population of cells breaks through - and when they do, it's a very fast-moving bunch indeed. Pfizer, for its part, is certainly hoping that further collection of data will start to show a real OS effect. They're going to need to - Avastin's provisional approval for breast cancer was based on earlier PFS numbers, which did not hold up when OS data came in. And that approval was revoked, as it should have been. Now, Avastin also had side effect issues, and quality-of-life issues, so these cases aren't directly comparable. But the FDA really wants to see a survival benefit, and that's what a new cancer drug really should offer. "You'll die at the same time, but with fewer tumors, and out more money" is not an appealing sales pitch. This issue has come up several times before, with other drugs, and it will come up again.

You'd think that a PFS effect like palbociclib's should translate into a real survival benefit, and as more data are added, it may well. But it's surely not going to be as impressive as people had hoped for, or it would have been apparent in the data we have. So take a look at the stories you're reading on the drug: if they mention this issue, good. If they just talk about what a promising drug for breast cancer palbociclib is, then that reporter (and that news outlet) is not providing the full story. (Here's one that does).

Update: there is an ongoing Phase III that's more specifically looking at overall survival. Its results will be awaited with great interest. . .

Comments (22) + TrackBacks (0) | Category: Cancer | Press Coverage

Outsourcing Everything

Email This Entry

Posted by Derek

Here's an article in Drug Discovery Today on "virtual pharmaceutical companies", and people who've been around the industry for some years must be stifling yawns already. That idea has been around a long time. The authors here defined a "VPC" as one that has a small managerial core, and outsources almost everything else:

The goal of a VPC is to reach fast proof of concept (PoC) at modest cost, which is enabled by the lack of expensive corporate infrastructure to be used for the project and by foregoing activities, such as synthesis optimization, which are unnecessary for the demonstration of PoC. . .The term ‘virtual’ refers to the business model of such a company based on the managerial core, which coordinates all activities with external providers, and on the lack of internal production or development facilities, rather than to the usage of the internet or electronic communication. Any service provider available on the market can be chosen for a project, because almost no internal investments in fixed assets are made.

And by necessity, such a company lives only to make deals with a bigger (non-virtual) company, one that can actually do the clinical trials, manufacturing, regulatory, sales and so on. There's another necessity - such a company has to get pretty nice chemical matter pretty quickly, it seems to me, in order to have something to develop. The longer you go digging through different chemical series and funny-looking SAR, all while doing it with outsourced chemistry and biology, the worse off you're going to be. If things are straightforward, it could work - but when things are straightforward, a lot of stuff can work. The point of having your own scientists (well, one big point) is for them to be able to react in real time to data and make their own decisions on where to go next. The better outsourcing people can do some of that, too, but their costs are not that big a savings, for that very reason. And it's never going to be as nimble as having your own researchers in-house. (If your own people aren't any more nimble than lower-priced contract workers, you have a different problem).

The people actually doing the managing have to be rather competent, too:

All these points suggest that the know-how and abilities of the members of the core management team are central to the success of a VPC, because they are the only ones with the full in-depth knowledge concerning the project. The managers must have strong industrial and academic networks, be decisive and unafraid to pull the plug on unpromising projects. They further need extensive expertise in drug development and clinical trial conduction, proven leadership and project management skills, entrepreneurial spirit and proficiency in handling suppliers. Of course, the crucial dependency on the skills of every single team member leaves little room for mistakes or incompetency, and the survival of a VPC might be endangered if one of its core members resigns unexpectedly

I think that the authors wanted to say "incompetence" rather than "incompetency" up there, but I believe that they're all native German speakers, so no problem. If that had come from some US-based consultants, I would have put it down to the same mental habit that makes people say "utilized" instead of "used". But the point is a good one: the smaller the organization, the less room there is to hide. A really large company can hold (and indeed, tends to accumulate) plenty of people who need the cover.

The paper goes on to detail several different ways that a VPC can work with a larger company. One of the ones I'm most curious about is the example furnished by Chorus and Eli Lilly. Chorus was founded from within Lilly as a do-everything-by-outsourcing team, and over the years, Lilly's made a number of glowing statements about how well they've worked out. I have, of course, no inside knowledge on the subject, but at the same time, many other large companies seem to have passed on the opportunity to do the same thing.

I continue to see the "VPC" model as a real option, but only in special situations. When there's a leg up on the chemistry and/or biology (a program abandoned by a larger company for business reasons, an older compound repurposed), then I think it can work. But trying it completely from the ground up seems problematic to me - though that could be because I've always worked in companies with in-house research. And it's true that even the stuff that's going on right down the hall doesn't work out all that often. One response to that is to say "Well, then, why not do the same thing more cheaply?" But another response is "If the odds are bad with your own people under your own roof, what are they when you contract everything out?"

Comments (28) + TrackBacks (0) | Category: Business and Markets | Drug Development

Cancer Immunotherapy's Growing Pains

Email This Entry

Posted by Derek

Cancer immunotherapy, which I've written about several times here (and which has claimed the constant attention of biopharma investors for some time now), has run into an inevitable difficulty: its patients are very sick, and its effects are very strong. Sloan-Kettering announced over the weekend that it's having to halt recruitment in a chimeric antigen receptor (CAR) T-cell trial against non-Hodgkin's lymphoma:

Six patients died of either disease relapse or progression, said MSK, while two patients died in remission from complications related to allogeneic bone marrow transplantation. An additional two patients died within two weeks of receiving a CAR-T cell infusion.

"The first of these two patients had a prior history of cardiac disease and the second patient died due to complications related to persistent seizure activity," noted MSK's presentation. "As a matter of routine review of adverse events on study, our center made a decision to pause enrollment and review these two patients in detail."

This study is associated with Juno Therapeutics, and the company says that it expects to continue once the review is finished. There's a huge amount of activity in this area, with Juno as one of the main players, and Novartis (who are working with the team at Penn) as another. Unfortunately, that activity is both legal and scientific; the patent situation in this area has yet to be clarified. This is an extremely promising approach, but it has a long way to go.

Comments (8) + TrackBacks (0) | Category: Cancer | Clinical Trials

April 4, 2014

GSK Dismisses Employees in Bribery Scandal. Apparently.

Email This Entry

Posted by Derek

Someone is letting it be known that GlaxoSmithKline has fired some of its employees in China in relation to the long-running bribery scandal there. This is one of those times when it's worth asking the "Cui bono?" follow-up question. Is this some sort of semi-authorized release, designed to show other GSK employees that the company is serious? Or to demonstrate the same, publicly, to the Chinese authorities? Or is someone honestly just letting this information out on their own - and if so, why?

Comments (20) + TrackBacks (0) | Category: Business and Markets

Ancient Modeling

Email This Entry

Posted by Derek

I really got a kick out of this picture that Wavefunction put up on Twitter last night. It's from a 1981 article in Fortune, and you'll just have to see the quality of the computer graphics to really appreciate it.

That sort of thing has hurt computer-aided drug design a vast amount over the years. It's safe to say that in 1981, Merck scientists did not (as the article asserts) "design drugs and check out their properties without leaving their consoles". It's 2014 and we can't do it like that yet. Whoever wrote that article, though, picked those ideas up from the people at Merck, with their fuzzy black-and-white monitor shots of DNA from three angles. (An old Evans and Sutherland terminal?) And who knows, some of the Merck folks may have even believed that they were close to doing it.

But computational power, for the most part, only helps you out when you already know how to calculate something. Then it does it for you faster. And when people are impressed (as they should be) with all that processing power can do for us now, from smart phones on up, they should still realize that these things are examples of fast, smooth, well-optimized versions of things that we know how to calculate. You could write down everything that's going on inside a smart phone with pencil and paper, and show exactly what it's working out when it displays this pixel here, that pixel there, this call to that subroutine, which calculates the value for that parameter over there as the screen responds to the presence of your finger, and so on. It would be wildly tedious, but you could do it, given time. Someone, after all, had to program all that stuff, and programming steps can be written down.

The programs that drove those old DNA pictures could be written down, too, of course, and in a lot less space. But while the values for which pixels to light up on the CRT display were calculated exactly, the calculations behind those were (and are) a different matter. A very precise-looking picture can be drawn and animated of an animal that does not exist, and there are a lot of ways to draw animals that do not exist. The horse on your screen might look exact in every detail, except with a paisley hide and purple hooves (my daughter would gladly pay to ride one). Or it might have a platypus bill instead of a muzzle. Or look just like a horse from outside, but actually be filled with helium, because your program doesn't know how to handle horse innards. You get the idea.

The same for DNA, or a protein target. In 1981, figuring out exactly what happened as a transcription factor approached a section of DNA was not possible. Not to the degree that a drug designer would need. The changing conformation of the protein as it approaches the electrostatic field of the charged phosphate residues, what to do with the water molecules between the two as they come closer, the first binding event (what is it?) between the transcription factor and the double helix, leading to a cascade of tradeoffs between entropy and enthalpy as the two biomolecules adjust to each other in an intricate tandem dance down to a lower energy state. . .that stuff is hard. It's still hard. We don't know how to model some of those things well enough, and the (as yet unavoidable) errors and uncertainties in each step accumulate the further you go along. We're much better at it than we used to be, and getting better all the time, but there's a good way to go yet.

But while all that's true, I'm almost certainly reading too much into that old picture. The folks at Merck probably just put one of their more impressive-looking things up on the screen for the Fortune reporter, and hey, everyone's heard of DNA. I really don't think that anyone at Merck was targeting protein-DNA interactions 33 years ago (and if they were, they splintered their lance against that one, big-time). But the reporter came away with the impression that the age of computer-designed drugs was at hand, and in the years since, plenty of other people have seen progressively snazzier graphics and thought the same thing. And it's hurt the cause of modeling for them to think that, because the higher the expectations get, the harder it is to come back to reality.

Update: I had this originally as coming from a Forbes article; it was actually in Fortune.

Comments (22) + TrackBacks (0) | Category: Drug Industry History | In Silico

April 3, 2014

More Fukuyama Corrections

Email This Entry

Posted by Derek

The Fukuyama group has another series of corrections out, this time in JACS. Here's one, and the others follow right behind it in the ASAP queue. This adds to the string of them in Organic Letters. It's more whiteout stuff - vanishing solvent peaks and impurities. These presumably don't affect the conclusions of the paper, but they don't make a person any more confident, either. One hopes that these high-profile cases will shake people up. . .

Comments (24) + TrackBacks (0) | Category: The Scientific Literature

Reality-Based Biotech Investing

Email This Entry

Posted by Derek

David Sable has some useful rules for investing in biotech stocks (more here). On the surface, many of these may look more applicable to people who are managing larger amounts of money, because he's talking about what to do (and not do) when you're walking around the JP Morgan healthcare conference, and so on. But the lessons behind his advice are sound for everyone - for example:

". . .stop looking for code words, Groucho Marx eyebrow raising, or any other type of "body language" silliness from insiders."

The corollary to that is that if you're thinking about investing in a small company that acts as if it's doing this sort of thing, or has been touted to you on the basis of such, turn around and look somewhere else. (Even worse, if you find yourself working for a company like this, you'd better start making plans). This is a sign of what I think of as the "professional wrestling" school of investing - it's the world of the people who see the market as a titanic battle between Good and Evil, the Good being the people who own the wonderful company's stock, and the Evil, naturally, being the Evil Shorts and Paid Bashers. As with other forms of conspiratorial thinking, it's easy for someone with this attitude to dismiss good advice (if exposed to same) by saying that the person offering it is naive - not clued in, wised up, or verb-prepositioned in general. If you knew how the world really works, you'd realize that the recent moves in the stock are all so transparent - it's the money managers, you see, who are trying to shake the shares from the weak hands so they can accumulate them in front of the Big Announcement.

The world doesn't work that way, I think, or not at the retail market level, at any rate. It's not a show, and there's no script. Many people investing in small biotech stocks have a reality-TV view of the world, when reality would serve them far better.

Comments (9) + TrackBacks (0) | Category: Business and Markets

April 2, 2014

Binding Assays, Inside the Actual Cells

Email This Entry

Posted by Derek

Many readers will be familiar, at least in principle, with the "thermal shift assay". It goes by other names as well, but the principle is the same. The idea is that when a ligand binds to a protein, it stabilizes its structure to some degree. This gets measured by watching its behavior as samples of bound and unbound proteins are heated up, and the most common way to detect those changes in protein structure (and stability) is by using a fluorescent dye. Thus another common name for the assay, DSF, for Differential Scanning Fluorimetry. The dye has a better chance to bind to the newly denatured protein once the heat gets to that point, and that binding event can be detected by increasing fluorescence. The assay is popular, since it doesn't require much in the way of specialized equipment and is pretty straightforward to set up, compared to something like SPR. Here's a nice slide presentation that's up on the web from UC Santa Cruz, and here's one of many articles on using the technique for screening.

I bring this up because of this paper last summer in Science, detailing what the authors (a mixed team from Sweden and Singapore) called CETSA, the cellular thermal shift assay. They're trying to do something that is very worthwhile indeed: measuring ligand binding inside living cells. Someone who's never done drug discovery might imagine that that's the sort of thing that we do all the time, but in reality, it's very tricky. You can measure ligand binding to an isolated protein in vitro any number of ways (although they may or may not give you the same answer!), and you can measure downstream effects that you can be more (or less) confident are the result of your compound binding to a cellular target. But direct binding measurements in a living cell are pretty uncommon.

I wish they weren't. Your protein of interest is going to be a different beast when it's on the job in its native environment, compared to sitting around in a well in some buffer solution. There are other proteins for it to interact with, a whole local environment that we don't know enough about to replicate. There are modifications to its structure (phosphorylation and others) that you may or may not be aware of, which can change things around. And all of these have a temporal dimension, changing under different cellular states and stresses in ways that are usually flat-out impossible to replicate ex vivo.

Here's what this new paper proposes:

We have developed a process in which multiple aliquots of cell lysate were heated to different temperatures. After cooling, the samples were centrifuged to separate soluble fractions from precipitated proteins. We then quantified the presence of the target protein in the soluble fraction by Western blotting . . .

Surprisingly, when we evaluated the thermal melt curve of four different clinical drug targets in lysates from cultured mammalian cells, all target proteins showed distinct melting curves. When drugs known to bind to these proteins were added to the cell lysates, obvious shifts in the melting curves were detected. . .

That makes it sound like the experiments were all done after the cells were lysed, which wouldn't be that much of a difference from the existing thermal shift assays. But reading on, they then did this experiment with methotrexate and its enzyme target, dihydrofolate reductase (DHFR), along with raltitrexed and its target, thymidylate synthase:

DHFR and TS were used to determine whether CETSA could be used in intact cells as well as in lysates. Cells were exposed to either methotrexate or raltitrexed, washed, heated to different temperatures, cooled, and lysed. The cell lysates were cleared by centrifugation, and the levels of soluble target protein were measured, revealing large thermal shifts for DHFR and TS in treated cells as compared to controls. . .

So the thermal shift part of the experiment is being done inside the cells themselves, and the readout is the amount of non-denatured protein left after lysis and gel purification. That's ingenious, but it's also the sort of idea that (if it did occur to you) you might dismiss as "probably not going to work" and/or "has surely already been tried and didn't work". It's to this team's credit that they ran with it. This proves once again the soundness of Francis Crick's advice (in his memoir What Mad Pursuit and other places) to not pay too much attention to your own reasoning about how your ideas must be flawed. Run the experiment and see.
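That readout (soluble protein fraction at each heating step, reduced to an apparent melting temperature) can be sketched in a few lines. This is a toy illustration, not anything from the paper: the data here are a clean two-state sigmoid rather than noisy Western blot densitometry, and the function names are mine.

```python
import numpy as np

def melt_curve(temps, tm, slope=1.5):
    # Two-state model: fraction of protein still soluble at each temperature
    return 1.0 / (1.0 + np.exp((temps - tm) / slope))

def apparent_tm(temps, soluble):
    # Temperature at which half the protein has precipitated, by linear
    # interpolation between sampled points (np.interp wants increasing x,
    # so reverse the melting data, which falls with temperature)
    return float(np.interp(0.5, soluble[::-1], temps[::-1]))

temps = np.arange(40.0, 68.0, 4.0)    # heating steps, degrees C
apo = melt_curve(temps, tm=50.0)      # untreated cells
treated = melt_curve(temps, tm=55.0)  # target stabilized by bound drug

shift = apparent_tm(temps, treated) - apparent_tm(temps, apo)
print(f"apparent thermal shift: {shift:.1f} degrees")
```

A real CETSA run would put quantified band intensities from the Western blot in place of those synthetic fractions; the curve-comparison step is the same.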

A number of interesting controls were run. Cell membranes seem to be intact during the heating process, to take care of one big worry. The effect of raltitrexed added to lysate was much greater than when it was added to intact cells, suggesting transport and cell penetration effects. A time course experiment showed that it took two to three hours to saturate the system with the drug. Running the same experiment on starved cells gave a lower effect, and all of these point towards the technique doing what it's supposed to be doing - measuring the effect of drug action in living cells under real-world conditions.

There's even an extension to whole animals, albeit with a covalent compound, the MetAP2 inhibitor TNP-470. It's a fumagillin derivative, so it's a diepoxide to start off, with an extra chloroacetamide for good measure. (You don't need that last reactive group, by the way, as Zafgen's MetAP2 compound demonstrates). The covalency gives you every chance to see the effect if it's going to be seen. Dosing mice with the compound, followed by organ harvesting, cell lysis, and heating after the lysis step showed that it was indeed detectable by thermal shift after isolation of the enzyme, in a dose-responsive manner, and that there was more of it in the kidneys than the liver.

Back in the regular assay, they show several examples of this working on other enzymes, but a particularly good one is PARP. Readers may recall the example of iniparib, which was taken into the clinic as a PARP-1 inhibitor, failed miserably, and was later shown not to really be hitting the target at all in actual cells and animals, as opposed to in vitro assays. CETSA experiments on it versus olaparib, which really does work via PARP-1, confirm this dramatically, and suggest that this assay could have told everyone a long time ago that there was something funny about iniparib in cells. (I should note that PARP has also been a testbed for other interesting cell assay techniques).

This leads to a few thoughts on larger questions. Sanofi went ahead with iniparib because it worked in their assays - turns out it just wasn't working through PARP inhibition, but probably by messing around with various cysteines. They were doing a phenotypic program without knowing it. This CETSA technique is, of course, completely target-directed, unless you feel like doing thermal shift measurements on a few hundred (or few thousand) proteins. But that makes me wonder if that's something that could be done. Is there some way to, say, impregnate the gel with the fluorescent shift dye and measure changes band by band? Probably not (the gel would melt, for one thing), but I (or someone) should listen to Francis Crick and try some variation on this.

I do have one worry. In my experience, thermal shift assays have not been all that useful. But I'm probably looking at a sampling bias, because (1) this technique is often used for screening fragments, where the potencies are not very impressive, and (2) it's often broken out to be used on tricky targets that no one can figure out how to assay any other way. Neither of those are conducive to seeing strong effects; if I'd been doing it on CDK4 or something, I might have a better opinion.

With that in mind, though, I find the whole CETSA idea very interesting, and well worth following up on. Time to look for a chance to try it out!

Comments (34) + TrackBacks (0) | Category: Chemical Biology | Drug Assays

April 1, 2014

Freeman Dyson on the PhD Degree

Email This Entry

Posted by Derek

From this interview:

"Oh, yes. I’m very proud of not having a Ph.D. I think the Ph.D. system is an abomination. It was invented as a system for educating German professors in the 19th century, and it works well under those conditions. It’s good for a very small number of people who are going to spend their lives being professors. But it has become now a kind of union card that you have to have in order to have a job, whether it’s being a professor or other things, and it’s quite inappropriate for that. It forces people to waste years and years of their lives sort of pretending to do research for which they’re not at all well-suited. In the end, they have this piece of paper which says they’re qualified, but it really doesn’t mean anything. The Ph.D. takes far too long and discourages women from becoming scientists, which I consider a great tragedy. So I have opposed it all my life without any success at all. . ."

Comments (68) + TrackBacks (0) | Category: General Scientific News

Off To the Publishers

Email This Entry

Posted by Derek

I don't know if my publisher was pulling my leg by having the deadline for the manuscript of "The Chemistry Book" be April 1, but that's what the contract says. And I've sent the thing off, so it's now in the hands of the editors. There's more to be done - I have some more dates to track down, for one, and I'd like to insert some more references for further reading. And then there are the illustrations, for which I've sent along many suggestions, and I'll need to write the captions for those once we've settled on what pictures to use. But the bulk of the writing is done, I'm glad to say.

Comments (13) + TrackBacks (0) | Category: Blog Housekeeping

Yeah, That Must Be It

Email This Entry

Posted by Derek

I'd sort of suspected this, um, breakthrough in catalysis that See Arr Oh is reporting. But how come more of my reactions don't work, eh? 'Cause there's been all kinds of crud in them, I feel pretty sure. Maybe the various crud subtypes (cruddotypes?) are canceling each other out. . .

Comments (21) + TrackBacks (0) | Category: Chemical News

March 31, 2014

Where The Hot Drugs Come From: Somewhere Else

Email This Entry

Posted by Derek

Over at LifeSciVC, there's a useful look at how many drugs are coming into the larger companies via outside deals. As you might have guessed, the answer is "a lot". Looking at a Goldman Sachs list of "ten drugs that could transform the industry", Bruce Booth says:

By my quick review, it appears as though ~75% of these drugs originated at firms different from the company that owns them today (or owns most of the asset today) – either via in-licensing deal or via corporate acquisitions. Savvy business and corporate development strategies drove the bulk of the list. . .I suspect that in a review of the entire late stage industry pipeline, the imbalanced ratio of external:internal sourcing would largely be intact.

He has details on the ten drugs that Goldman is listing, and on the portfolios of several of the big outfits in the industry, and I think he's right. It would be very instructive to know what the failure rate, industry-wide, of inlicensed compounds like this might be. My guess is that it's still high, but not quite as high as the average for all programs. The inlicensed compounds have had, in theory, more than one set of eyes go over them, and someone had to reach into their wallet after seeing the data, so you'd think that they have to be in a little bit better shape. But a majority still surely fail, given that the industry's rate overall is close to 90% clinical failure (the math doesn't add up if you try to assume that the inlicensed failure rate is too low!)
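To make that last point concrete, here's the weighted-average arithmetic with invented round numbers (none of these proportions come from the post or the Goldman list):

```python
# If ~75% of the late-stage pipeline is externally sourced and the overall
# clinical failure rate is ~90%, the external failure rate is pinned near
# 90% no matter what you assume about the internal programs.
overall_failure = 0.90   # industry-wide clinical failure rate (approximate)
external_share = 0.75    # fraction of pipeline that was inlicensed (invented)
internal_failure = 1.00  # extreme case: every internal program fails

external_failure = (overall_failure
                    - (1 - external_share) * internal_failure) / external_share
print(f"implied external failure rate: {external_failure:.0%}")
```

Even granting the internal programs a 100% failure rate, the inlicensed compounds still have to fail about 87% of the time to keep the overall average at 90%.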

Also of great interest is the "transformational" aspect. We can assume, I think, that most of the inlicensed compounds came from smaller companies - that's certainly how it looks on Bruce's list. This analysis suggested that smaller companies (and university-derived work) produced more innovative drugs than internal big-company programs, and these numbers might well be telling us the same thing.

This topic came up the last time I discussed a post from Bruce, and Bernard Munos suggested in 2009 that this might be the case as well. It's too simplistic to just say Small Companies Good, Big Companies Bad, because there are some real counterexamples to both of those assertions. But overall, averaged over the industry, there might be something to it.

Comments (26) + TrackBacks (0) | Category: Business and Markets | Drug Industry History

A Quick Clean-Up

Email This Entry

Posted by Derek

Well, while I wasn't watching over the weekend, the comments section to this post kind of veered off the road. I've deleted a number of trolling comments, and all the various replies to them, and further comments to that entry are now closed. I rarely do this sort of thing, but (ironically) I was just saying the other evening that pretty much the only time I delete comments is when they're nothing but ad hominem. There are plenty of other places on the web to trade insults and gibberish (some sites specialize in nothing but), so I don't think it's any great loss to the world if this site doesn't join in. We'll now resume our regularly scheduled programming.

Comments (11) + TrackBacks (0) | Category: Blog Housekeeping

March 28, 2014

More on the UT-Austin Retraction Case

Email This Entry

Posted by Derek

I mentioned an unusual retraction from Organic Letters here last year, and here's some follow-up to the story:

Nearly six years after Suvi Orr received a Ph.D. in chemistry from the University of Texas, the university told her it has decided to do something that institutions of higher learning almost never do: revoke the degree. Orr, in turn, has sued UT in an effort to hold onto the doctorate that launched her career in the pharmaceutical industry.

Her lawsuit in state district court in Travis County contends that revocation is unwarranted and that the university violated her rights by not letting her defend herself before the dissertation committee that condemned her research long after she graduated. In addition, she says, the committee relied heavily on her former professor, who, she claims, was motivated to “cast the blame elsewhere.”

What a mess. More details as things develop. . .

Comments (17) + TrackBacks (0) | Category: The Dark Side | The Scientific Literature

A Huntington's Breakthrough?

Email This Entry

Posted by Derek

Huntington's is a terrible disease. It's the perfect example of how genomics can only take you so far. We've known since 1993 what the gene is that's mutated in the disease, and we know the protein that it codes for (Huntingtin). We even know what seems to be wrong with the protein - it has a repeating chain of glutamines on one end. If your tail of glutamines is less than about 35 repeats, then you're not going to get the disease. If you have 36 to 39 repeats, you are in trouble, and may very well come down with the less severe end of Huntington's. If there are 40 or more, doubt is tragically removed.
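Those repeat-count thresholds are crisp enough to write down as a toy function (the function name and the labels are mine; the cutoffs are the approximate ones just described):

```python
def huntingtons_risk(glutamine_repeats: int) -> str:
    # Cutoffs as described above: under ~36 repeats, no disease;
    # 36 to 39, the less severe (reduced-penetrance) range;
    # 40 or more, the disease develops
    if glutamine_repeats < 36:
        return "unaffected"
    if glutamine_repeats <= 39:
        return "reduced penetrance"
    return "fully penetrant"

print(huntingtons_risk(20), huntingtons_risk(38), huntingtons_risk(42))
```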

So we can tell, with great precision, if someone is going to come down with Huntington's, but we can't do a damn thing about it. That's because despite a great deal of work, we don't really understand the molecular mechanism at work. This mutated gene codes for this defective protein, but we don't know what it is about that protein that causes particular regions of the brain to deteriorate. No one knows what all of Huntingtin's functions are, and not for lack of trying, and multiple attempts to map out its interactions (and determine how they're altered by a too-long N-terminal glutamine tail) have not given a definite answer.

But maybe, as of this week, that's changed. Solomon Snyder's group at Johns Hopkins has a paper out in Nature that suggests an actual mechanism. They believe that mutant Huntingtin binds (inappropriately) a transcription factor called "specificity protein 1", which is known to be a major player in neurons. Among other things, it's responsible for initiating transcription of the gene for an enzyme called cystathionine γ-lyase. That, in turn, is responsible for the last step in cysteine biosynthesis, and put together, all this suggests a brain-specific depletion of cysteine. (Update: this could have numerous downstream consequences - this is the pathway that produces hydrogen sulfide, which the Snyder group has shown is an important neurotransmitter (one of several they've discovered), and it's also involved in synthesizing glutathione. Cysteine itself is, of course, often a crucial amino acid in many protein structures as well.)

Snyder is proposing this as the actual mechanism of Huntington's, and his group has shown, in human tissue culture and in mouse models of the disease, that supplementation with extra cysteine can stop or reverse the cellular signs of the disease. This is a very plausible theory (it seems to me), and the paper makes a very strong case for it. It should lead to immediate consequences in the clinic, and in the labs researching possible therapies for the disease. And one hopes that it will lead to immediate consequences for Huntington's patients themselves. If I knew someone with the Huntingtin mutation, I believe that I would tell them to waste no time taking cysteine supplements, in the hopes that some of it will reach the brain.

Comments (20) + TrackBacks (0) | Category: Biological News | The Central Nervous System

March 27, 2014

A Look Back at Big Pharma Stocks

Email This Entry

Posted by Derek

Four years ago, I wrote about what I called "Big Pharma's Lost Decade" in the stock market. I thought it would be worth revisiting that, with some different time points.

At the top is the performance of those same big drug companies since I wrote that blog post. Note that Bristol-Myers Squibb has been the place to be during that period (lots of excitement around their oncology pipeline, for one thing). Pfizer has beaten the S&P index over that time as well. And they've done it while paying a higher dividend than the aggregate S&P, too, of course - I'd like to find a way to include dividends into charts like these for an even more real-world comparison. Everyone else is behind.
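As for folding dividends in, that's mostly reinvestment arithmetic. A minimal sketch, with entirely made-up prices and payouts:

```python
# Total return with dividend reinvestment: each payout buys more shares at
# that year's price, so the ending value reflects the dividends and not
# just the price chart. All numbers here are invented for illustration.
prices = [100.0, 104.0, 98.0, 110.0, 120.0]  # hypothetical year-end prices
dividends = [3.0, 3.1, 3.2, 3.3]             # hypothetical per-share payouts

shares = 1.0
for price, payout in zip(prices[1:], dividends):
    shares += shares * payout / price        # reinvest at that year's price

price_return = prices[-1] / prices[0] - 1
total_return = shares * prices[-1] / prices[0] - 1
print(f"price only: {price_return:.1%}; with dividends: {total_return:.1%}")
```

On these made-up numbers the price-only return is 20%, while the reinvested figure is noticeably higher - which is exactly the gap that a price chart hides.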

The next chart shows a ten-year time frame. Bristol-Myers Squibb is still on top, although you'll note that the overall gain is basically the same as the gain since 2010 (that is, it's all come since then). And now J&J is right behind them, and they're the only two whose stock prices have beaten the S&P index over this period. Note that Pfizer and Lilly are actually down from this time point.

Then we have performance since 2000, the twenty-first century chart. Since this was during the Crazy Years in the market, just about everyone is down when measured from here, except for J&J (which is at about the same gain as if you'd started in 2004). The most dramatic mover is Bristol-Myers Squibb - if you bought in at the start of that last chart, you're up 109%. If you bought in at the start of this one, you're down 21%.

And that brings us to the last chart, which is basically "Since I started working in the drug industry". I'd been on the case for about three months by the end of 1990, which is where this one starts. And there are many interesting things to note - first among them, what a big, big deal the latter half of the 1990s were in the stock market. And more specifically, what a big, big deal they were for Pfizer's stock. Holy mackerel, will you look at that chart - compared to the rest of the industry, Pfizer's stock was an absolute monster, and there you have a big driver for all of the company's merger-rific behavior during that period. It paid. Not so much in research results, of course, but it paid the shareholders, and it paid whoever had lots of PFE stock and options. (And it paid the firms on the Street who did the deals with them, too, but that's always the case for them). A really long-term Pfizer shareholder can't be upset at all with the company's performance versus the S&P over that time period. How many have held it, though?

But the other thing to note is J&J. There they are again - it's only in that first chart that they're lagging. Longer-term, they just keep banging away. That, one would have to assume, is at least partly because they've got all those other medical-related businesses keeping them grounded during the whole time. Back when I worked for Bayer, at the Wonder Drug Factory, analysts were forever banging on about how the company just had to, had to break up. Outdated conglomerate model, holding everyone back. So much hotness waiting to be released. But Bayer hasn't been holding up too badly, either, and Bernard Munos has some things to say about both them and J&J.

It is not a good idea (to "undiversify") because, at the moment, we do not have good tools to mitigate risk in drug R&D, which is a problem at the macroeconomic level, because capital does not flow to this industry as it should. Too many investors have been burned too badly and are now investing elsewhere or sitting on the fence, so we need to somehow get better at that. . .we've got to live with the situation where risk in the pharmaceutical industry cannot really be mitigated adequately. You can do portfolio management. Every company has done portfolio management. It has failed miserably across the board. That was supposed to protect everybody against patent cliffs, and everybody has fallen down patent cliffs, so clearly portfolio management has not worked.

Mind you, "undiversifying" is exactly what Pfizer is trying right now. They're not only trying to undo some of the gigantism of all those mergers, they're shedding whatever they have that is Not Pharma. So they're running that experiment for us, as they have some others over the years. . .

Comments (14) + TrackBacks (0) | Category: Business and Markets

Another Target Validation Effort

Email This Entry

Posted by Derek

Here's another target validation initiative, with GSK, the EMBL, and the Sanger Institute joining forces. It's the Centre for Therapeutic Target Validation (CTTV):

CTTV scientists will combine their expertise to explore and interpret large volumes of data from genomics, proteomics, chemistry and disease biology. The new approach will complement existing methods of target validation, including analysis of published research on known biological processes, preclinical animal modelling and studying disease epidemiology. . .

This new collaboration draws on the diverse, specialised skills from scientific institutes and the pharmaceutical industry. Scientists from the Wellcome Trust Sanger Institute will contribute their unique understanding of the role of genetics in health and disease and EMBL-EBI, a global leader in the analysis and dissemination of biological data, will provide bioinformatics-led insights on the data and use its capabilities to integrate huge streams of different varieties of experimental data. GSK will contribute expertise in disease biology, translational medicine and drug discovery.

That's about as much detail as one could expect for now. It's hard to tell what sorts of targets they'll be working on, and by "what sorts" I mean what disease areas, what stage of knowledge, what provenance, and everything else. But the press release goes on to say that the information gathered by this effort will be open to the rest of the scientific community, which I applaud, and that should give us a chance to look under the hood a bit.

It's hard for me to say anything bad about such an effort, other than wishing it done on a larger scale. I was about to say "other than wishing it ten times larger", but I think I'd rather have nine other independent efforts set up than making this one huge, for several reasons. Quis validet ipsos validares, if that's a Latin verb and I haven't mangled it: Who will validate the validators? There's enough trickiness and uncertainty in this stuff for plenty more people to join in.

Comments (11) + TrackBacks (0) | Category: Biological News | Drug Assays

Dichloroacetic Acid, In a New Form

Email This Entry

Posted by Derek

Remember dichloroacetic acid? In 2007, there was a stir about it as a cancer therapy, and on internet forums you still see it referenced as a "cancer cure" that no drug company will touch because it's unpatentable/doesn't have to be taken forever/too cheap/not evil enough, etc.

The people spreading that stuff around don't know how to use PubMed, because a look through the literature will show that DCA is still an active area of research (in some cases, involving people who've taken it on their own). Interestingly, PubMed also makes it apparent that the rest of the literature on the compound concerns its role as a water pollutant. But the problem with it as a drug is that it has poor pharmacokinetics. Its site of action is the mitochondrion, but it doesn't do a very good job of getting there (as one would expect from a low-molecular-weight carboxylic acid, especially one that's as ionized as this one is at body pH).
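That "ionized at body pH" point falls straight out of the Henderson-Hasselbalch equation. Here's a minimal sketch, assuming a literature pKa of roughly 1.3 for dichloroacetic acid (the exact value varies a bit by source):

```python
def fraction_ionized_acid(pka, ph):
    """Henderson-Hasselbalch for a monoprotic acid:
    [A-]/[HA] = 10**(pH - pKa), so the ionized fraction
    is ratio / (1 + ratio)."""
    ratio = 10 ** (ph - pka)
    return ratio / (1 + ratio)

# DCA: pKa ~1.3 (assumed literature value), physiological pH 7.4
print(f"{fraction_ionized_acid(1.3, 7.4):.5%}")
```

With six pH units between pKa and plasma pH, the compound is ionized to better than 99.999%, and charged species like that cross membranes (and get into mitochondria) poorly - hence the interest in prodrug-style delivery scaffolds like the one above.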

So here's an attempt to do something about that. The authors, from the University of Georgia, tether several DCA molecules to a scaffold that should do a better job of targeting mitochondria. They go as far as cellular data to prove the point, but there's nothing in vivo (I'm not sure what would happen in that case, but it would seem worth finding out).

This, one should note, is a new molecule, and one that was perfectly capable of being patented - it has novelty, and it apparently has more utility for its stated purpose. Every time you hear about how Evil Pharma won't work on X, or Y, or Z, because "they can't patent it", keep in mind that we here at Evil Pharma know a lot of ways to patent things. Part of what makes us so darn evil, you know.

Comments (18) + TrackBacks (0) | Category: Cancer

March 26, 2014

A New Fluorination

Email This Entry

Posted by Derek

New fluorination reactions are always welcome, and there's one out in Angew. Chem. that looks really interesting. Robert Britton's group at Simon Fraser University reports using tetrabutylammonium decatungstate as a photocatalyst with N-fluorobenzenesulfonimide (NFSI). This system fluorinates unactivated alkane C-H bonds, as shown at left, and apparently tolerates several functional groups in the process.

Note that the amino acids were fluorinated as their hydrochloride salts; the free bases didn't work. There aren't any secondary or tertiary amine substrates in the paper, nor are there any heterocycles, both of which are cause to wonder whenever you see a new fluorination method. But I think I'm going to order up some tungstate, turn on the lamp, and see what I get.

Update: via Chemjobber, here's an excellent process chemistry look at scaling up a trifluoromethylation reaction.

Comments (16) + TrackBacks (0) | Category: Chemical News

Getcher Nucleic Acids, Cheap

Email This Entry

Posted by Derek

Via Nathaniel Comfort on Twitter, I note that the health-food people are still selling "DNA supplements". I remember seeing these in a vitamin store some years ago, and wrinkling my brow as I thought about the implications. Does your food have enough DNA in it? Actually, these pills turn out to be 100 mg of RNA and only 10 mg of DNA, so you might want to adjust your dosages accordingly.

Turns out that the only negative review on the actual site is from someone who's upset that there's so much filler in the pills themselves. More DNA is what he wants. He should try what another guy further down the page does, and swallow five of the things at a time. It gives him "energy", y'know, and he's not alone. Every one of these satisfied customers has felt the energy, and some of them even have picked up a healthy glow to their skin. So there you have it. I thought that peanut M&Ms gave me energy (although maybe not the healthy glow), but I should clearly start snacking on RNA instead.

When I called my wife with this news, her first comment was "RNA from what?" I countered that a whole bottle of pills was only $4.99, and this was (brace yourselves) fifty per cent off the usual price. (In the reviews, one customer found this price very "exceptable"). Anyway, I said, this was not the time to be looking under the hood of such an opportunity. "And how much is shipping?" she wanted to know. I replied that I'm really not sure how I'm still married to her, what with that suspicious nature and all. I tell you.

Comments (31) + TrackBacks (0) | Category: Snake Oil