Boston edges out Cambridge for new technology firms

In a somewhat surprising development, Boston actually had more venture capital deals than Cambridge in 2013: 97 deals were signed in Boston vs. 78 in Cambridge. Cambridge still holds a commanding lead in dollar terms, however, with total funding of $820 million (down $105 million from 2012) vs. $593 million for Boston (up $85 million from 2012). The entire Boston metro area totaled over $3 billion in funding, approximately the same as in 2012.
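A quick back-of-envelope calculation on those numbers (just dividing the totals above, not figures from the report itself) shows how different the average deal size is in the two cities:

```python
# Back-of-envelope average deal size, using the 2013 totals quoted above.
cambridge_total_m, cambridge_deals = 820, 78   # $ millions, number of deals
boston_total_m, boston_deals = 593, 97

print(f"Cambridge: ~${cambridge_total_m / cambridge_deals:.1f}M per deal")  # ~$10.5M
print(f"Boston:    ~${boston_total_m / boston_deals:.1f}M per deal")        # ~$6.1M
```

Roughly $10.5 million per deal in Cambridge versus about $6 million in Boston, which fits the pattern described below: Boston is pulling in the smaller, earlier-stage companies.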

As reported in the Boston Globe, Boston seems to be attracting more of the very small, very early-stage firms. Kendall Square in Cambridge and Waltham in the suburbs have matured somewhat and are now home to far more mid-sized and large established companies, while tiny startups and the venture capital firms that back them are increasingly finding homes in Boston's downtown Innovation, Financial, and Leather Districts. Anecdotally, it's a younger crowd being drawn downtown.

“There are more subway lines, and it’s a little hipper in the downtown in many ways because there’s better food.”

Besides the tasty food, another big factor is the proximity effect. One VC firm, NextView Ventures, is an investor in nine companies that are all within walking distance of its offices in the Leather District. Of course, Cambridge and Boston are so close that you can almost walk from one to the other during your lunch hour, so parsing out Cambridge's vs. Boston's lead in VC funding is a bit of hair-splitting. What I can say is that on a clear day, I can see a surprising number of construction cranes in both cities.


Peter Drucker on Outsourcing and Information Technology

I stumbled on a 2004 interview with Peter Drucker, one of the giants of management consulting and education. Drucker was famous for coining the term "knowledge worker," and he was an early proponent of outsourcing, going back decades. It's worth reading the interview in its entirety, but here are a couple of excerpts that resonated with me now, in 2014. The first is on the true value of outsourcing. Hint: it's not cost savings.

Q: How can the productivity of knowledge workers be measured and improved?

A: Nobody has really looked at productivity in white-collar work in a scientific way. But whenever we do look at it, it is grotesquely unproductive. As you know, most of my work these days is with universities and hospitals and churches, which are three of the biggest knowledge-worker employers, and their productivity is dismal. In part this is because knowledge work by definition is highly specialized, and that means that the utilization of the knowledge worker tends to be very low.

The inefficiency of knowledge workers is partly the legacy of the 19th-century belief that a modern company tries to do everything for itself. Now, thank God, we’ve discovered outsourcing, but I would also say we don’t yet really know how to do outsourcing well. Most look at outsourcing from the point of view of cutting costs, which I think is a delusion. What outsourcing does is greatly improve the quality of the people who still work for you. I believe you should outsource everything for which there is no career track that could lead into senior management. When you outsource to a total-quality-control specialist, he is busy 48 weeks a year working for you and a number of other clients on something he sees as challenging. Whereas a total-quality-control person employed by the company is busy six weeks a year and the rest of the time is writing memoranda and looking for projects. That’s why when you outsource you may actually increase costs, but you also get better effectiveness.

Here’s another comment on information technology:

Information technology forces you to organize your processes more logically. The computer can handle only things to which the answer is yes or no. It cannot handle maybe. It’s not the computerization that’s important, then; it’s the discipline you have to bring to your processes. You have to do your thinking before you computerize it or else the computer simply goes on strike.

This enforced discipline has some disadvantages, because it often forces people to oversimplify. Also, the process of arriving at business decisions isn’t always systematic enough to be supported by computers. You have to take the assumptions out of the mind of the decision-maker and put them explicitly into the process, along with a method to check them, and only then can a computer help you manage it. Older executives find it excruciating to have to be that explicit, because they just don’t want to be. Besides, as we all know, many decisions are ultimately made by the hydrostatic pressure in the boss’s bladder.

As I said, read the whole thing.  It’s still very relevant, 10 years later.

 


Juno

Museum of Fine Arts, Boston



Spectacular Night Sky as Winter Turns to Spring

As winter turned to spring here in Eastern Massachusetts, tonight's sky was spectacular (the image above is a fairly good approximation). At about 9 o'clock, facing west, there was an amazing view of Orion looming in the middle of the sky, with Canis Major following close on his heels. Up and to the right was Taurus, and farther on, the Pleiades. Directly above Orion, Jupiter shone brightly in the middle of Gemini. Over my left shoulder was Leo, and directly behind me was the Big Dipper (or Ursa Major, if you prefer). The stars were especially clear and bright against a dark background sky. Through 10×50 binoculars, the Pleiades cluster looked like seven scattered diamonds on sparkling velvet. The individual stars Betelgeuse and Aldebaran were bright orange, and Sirius and Rigel cold blue. Orion's belt and sword, including the Orion Nebula, were crystal clear (though not nearly as colorful as in all the photographs you see online). Even through my binoculars, I could make out two of Jupiter's moons, and when I got out my 70 mm telescope, Callisto, Ganymede, and Io stretched out like little jewels on both sides of the planet's striped disk. Apparently, Europa was in front of Jupiter at the time.

All in all, an amazing show for our light-polluted sky!

If all that sounded like a lot of Greek (or Roman, or Arabic) to you, try reading H.A. Rey's Find the Constellations. It's a great place to start learning about the stars. No telescope or binoculars required.


Moore’s Law for Space-Based Imaging

Hundreds of eyes in the sky, just like this one. Getting better, cheaper, and more numerous every year.

Planet Labs Dove

Imagine a network of satellites, all taking daily pictures of the Earth beneath them and reporting that data back to Earth in real time. Owned by the CIA? DOD? NSA? Nope. Silicon Valley startup Planet Labs, the latest disruptive innovator in space, will have over a hundred of these small imaging satellites in orbit by the end of this year:

But that is just the start. Last week, Planet Labs announced that it would put about 100 satellites into space from the United States and Russia, bringing the total number of “Doves,” as the company calls them, to 131. That larger network, which Planet Labs hopes to complete within a year, is expected to create a daily photo mosaic of most of Earth.

That mosaic could be valuable to private customers, like agricultural companies monitoring farmlands, or even to governments trying to figure out how to aid natural disaster victims. The company has so far booked contracts worth more than the $65 million in private equity it has raised, according to Will Marshall, the company’s co-founder and chief executive.

Like many disruptive innovations, these satellites (built from mobile phone components!) aren't as good as full-scale ones costing hundreds of millions of dollars, but they cost a tiny fraction as much. And they're getting better and cheaper, really quickly:

By making little machines that are often updated, Mr. Gillmore said, “we’re building satellites with computers that are six months old. Lots of satellites have 10-year-old computers.” Version nine, which is almost complete, cost about 35 percent less than the current version in space, and was made four times faster, he estimated.

And that really is just the start. The company is planning to add more and more new and improved versions of these satellites over time. Think of a Moore's Law for space-based imaging: more coverage area, more images per day, improved resolution, perhaps observations at additional wavelengths besides visible light. Real-time crop imaging, firefighting, climate monitoring, ecological studies, global security (or insecurity)… What applications might exist that haven't even been conceived of yet?
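To get a feel for what that kind of compounding means, here's a toy calculation (my own extrapolation, not company data), assuming each new satellite generation keeps cutting cost by roughly the 35 percent quoted above:

```python
# Toy extrapolation: if each satellite generation costs ~35% less than the last
# (the figure quoted for version nine above), cost falls exponentially,
# Moore's-Law style.  Purely illustrative, not actual Planet Labs numbers.
cost = 1.0  # relative cost of the current generation
for generation in range(1, 6):
    cost *= 1 - 0.35
    print(f"generation +{generation}: {cost:.2f}x the cost of today's satellite")
# After five more generations, relative cost is ~0.12x, almost a 90% reduction.
```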


More Biotech Hub Rankings

GEN (Genetic Engineering and Biotechnology News) has released its own, more comprehensive biotech hub rankings. As expected, Boston-Cambridge and the Bay Area are at the top, with San Francisco edging out Boston in most criteria. The GEN rankings include not only VC investing, but also data on biotechnology patents since 1976, an estimate of the number of biomedical employees in each area, NIH grant dollars, and regional square footage of lab space. Some of those numbers are pretty eye-popping. According to GEN, the Bay Area alone has nearly 30 million square feet of lab space and has generated over 3,400 biotechnology patents. Boston-Cambridge, meanwhile, clocks in at 2,900 patents and nearly 19 million square feet of space, with another 3 million+ coming online in the near future.


Venture Funding of Biotech is VERY Concentrated… and Very Limited

Fierce Biotech has released its latest analysis of last year's venture capital funding of biotech in the U.S., broken down by metropolitan area:

Source: http://www.fiercebiotech.com/story/top-15-cities-biotech-venture-funding/2014-03-06

San Francisco is back out in front, edging out Boston-Cambridge.  After San Diego and Washington, funding tails off rapidly, and by the time you hit #15 Chicago, you’re down to about 1% of the pie.

There are a couple of key takeaways.  First, biotech startup funding is very concentrated, with the big three cities (San Francisco, Boston-Cambridge, and San Diego) totaling more than 60% of all funding ($2.5B).  That number, incidentally, is 25% more than the total of all biotech VC funding in Europe.

Second interesting point: there really isn't that much VC funding of biotech! The total is only about $4.5B in the U.S., something just under $2B in Europe, and a smattering elsewhere. In other words, annual global VC funding of biotech is smaller than the R&D budget of just one large pharma company. Keep that in mind when you hear multiple pharma companies planning to source a large part of their pipelines from licensing and acquisitions.

Finally, recognize that distribution?  Yup, Pareto.  Apparently biotech hub performance is just like employee performance.


Icicle Forest

Icicle Forest, February 2014

Eastern Massachusetts, February 2014


Employee Performance Does Not Follow a Bell Curve

Here’s a great post written by Josh Bersin that’s gotten a lot of attention in the past few days:  The Myth of the Bell Curve.  As I mentioned several weeks ago, more and more evidence shows that employee performance in the 21st century does not follow a bell curve, and that forced stack ranking is a destructive practice.

As Josh explains, employee performance more typically follows a power-law (or Pareto) distribution: a sharp peak of just a few very high-performing employees, followed by a long tail of what might be considered below-average performers. But a better way to think of the distribution is as a set of hyper-performers embedded within — and often tremendously helped by — a large pool of normal performers. Yes, there are occasionally truly poor performers, but it makes no sense to arbitrarily label 5-10% of all employees as such. In fact, even without forced bottom rankings, there are several ways in which the assumptions behind bell-curve-driven evaluation schemes hurt performance and undermine the teamwork and collaboration essential for success in a modern information-based economy (or academic environment).
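As a rough illustration of the difference (a simulation with made-up parameters, not Bersin's data), compare what share of total output the top 10% of people account for under a bell curve versus a Pareto-type distribution:

```python
# Illustrative simulation: top-10% share of total output under a normal
# (bell curve) model vs. a Pareto (power-law) model of individual performance.
# The parameters are arbitrary, chosen only to show the qualitative difference.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

normal_output = rng.normal(loc=100, scale=15, size=n).clip(min=0)  # bell curve
pareto_output = (rng.pareto(a=1.5, size=n) + 1) * 50                # heavy tail

def top_decile_share(output):
    top = np.sort(output)[-n // 10:]   # output of the top 10% of performers
    return top.sum() / output.sum()

print(f"bell curve: top 10% produce ~{top_decile_share(normal_output):.0%} of total output")
print(f"power law:  top 10% produce ~{top_decile_share(pareto_output):.0%} of total output")
```

Under the heavy-tailed model, a small group accounts for a wildly disproportionate share of the results, which is exactly the hyper-performer pattern described above.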


P values. I do not think that value means what you think it means.

Regina Nuzzo has a news feature in Nature about P values that every biomedical scientist should read.  P values — the common measure of statistical significance (i.e. the “believability” of an experiment) — do not mean what most scientists think they mean.

The P-value calculation was originally developed in the 1920s by the statistician Ronald Fisher as a way to judge whether an observed result was worth looking into further.

Researchers would first set up a ‘null hypothesis’ that they wanted to disprove, such as there being no correlation or no difference between two groups. Next, they would play the devil’s advocate and, assuming that this null hypothesis was in fact true, calculate the chances of getting results at least as extreme as what was actually observed. This probability was the P value. The smaller it was, suggested Fisher, the greater the likelihood that the straw-man null hypothesis was false.

For many biomedical experiments, a P-value below 0.05 or 0.01 is considered "statistically significant" and therefore interpreted as a believable result. Many experiments report calculated P-values of 0.001 or even lower. Attracted by the apparent precision of a calculated P-value and its resemblance to a true probability, working scientists have come to interpret the P-value as the actual probability that their result is correct. But that is not true. The P-value summarizes the data in the context of a specific null hypothesis; it does not take into account the odds that a real effect was there in the first place.

The mathematics are complicated, but by one widely used calculation quoted in the article, a P-value of 0.01 actually corresponds in the real world to an 11% probability that the experimental result is just due to random chance. For P = 0.05, the probability rises to 29%! Even worse, some scientists are guilty of data dredging, or "p-hacking": trying different conditions until you get the P-value you want. At that point the random-sampling assumptions behind the P-value go out the window and, if you've tortured the data enough, the calculation becomes meaningless. No wonder the overall reproducibility of biomedical research has been called into question.
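The arithmetic behind those figures appears to be the minimum-Bayes-factor bound (Sellke, Bayarri, and Berger), evaluated at 50-50 prior odds that a real effect exists. Here is a minimal sketch of that calculation; it's my reconstruction, not code from the article:

```python
# Minimum-Bayes-factor calibration of P values (Sellke/Bayarri/Berger bound),
# assuming 50-50 prior odds that a real effect exists.  My reconstruction of
# the calculation behind the 11% / 29% figures quoted above.
import math

def min_prob_result_is_noise(p, prior_prob_effect=0.5):
    """Lower bound on P(null is true | data), i.e. the chance the result is a false alarm."""
    bf = -math.e * p * math.log(p)                       # minimum Bayes factor (valid for p < 1/e)
    prior_odds_null = (1 - prior_prob_effect) / prior_prob_effect
    posterior_odds_null = bf * prior_odds_null
    return posterior_odds_null / (1 + posterior_odds_null)

for p in (0.05, 0.01, 0.001):
    print(f"P = {p}: chance the result is just noise >= {min_prob_result_is_noise(p):.0%}")
# P = 0.05 -> ~29%, P = 0.01 -> ~11%, P = 0.001 -> ~2%
```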

A statistically significant P-value is in fact just an invitation to repeat the experiment.  A practicing scientist needs to realize that, even with a highly “significant” P-value, there is still a relatively high probability that the result will not repeat.  The best advice — something that I learned in the first week of grad school — is that you shouldn’t believe anything until you see n=2.  Better yet, n=3.
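To make the "repeat the experiment" point concrete, here is a small simulation of my own (a two-sample t-test with an assumed modest effect and a typical small sample, nothing from the article): how often does a result that comes out "significant" the first time also come out significant on an exact repeat?

```python
# Toy replication simulation (my own assumptions: true effect of 0.5 SD,
# 20 subjects per group, alpha = 0.05).  Not data from the Nature article.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
effect, n_per_group, alpha, trials = 0.5, 20, 0.05, 20_000

def one_experiment_pvalue():
    control = rng.normal(0.0, 1.0, n_per_group)
    treated = rng.normal(effect, 1.0, n_per_group)
    return stats.ttest_ind(treated, control).pvalue

significant_first = replicated = 0
for _ in range(trials):
    if one_experiment_pvalue() < alpha:        # original experiment "worked"
        significant_first += 1
        if one_experiment_pvalue() < alpha:    # identical repeat of the experiment
            replicated += 1

print(f"significant the first time:  {significant_first / trials:.0%} of experiments")
print(f"significant again on repeat: {replicated / significant_first:.0%} of those")
# With these assumptions, only about a third of the "significant" results repeat.
```

The exact numbers depend entirely on the assumed effect size and sample size, but the lesson is the one above: a single significant P-value is a reason to repeat the experiment, not to believe the result.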
