Carlo Quinonez

I'm a postdoc at UCSD, and I've been developing open hardware, mainly for use in our own lab. Over the last two years it has grown into something much more. I also want to push back a bit on the impression Joe may have given of what biology is and how simple it is. I'm from SDCSB, the San Diego Center for Systems Biology, one of the national centers for systems biology funded by the NIH. There are only 13 across the country, and they exist to foster collaboration and innovation in systems-level biology.

The biology at the core, the stuff in your textbooks, is simple; that's the biology of 10 or 20 years ago. The challenges we're facing now are much deeper. Pharma doesn't need the same things it used to. We focus on three main research areas, and only one of them is biology proper: networks and mathematical modeling for cellular analysis. Our group studies immune system regulation. It's made up of individual proteins and local feedback networks, and this particular map is the best guess of how inflammation is regulated. Just inflammation. Now, keep in mind that your cells are controlled by many, many such networks, so really understanding biology requires understanding these giant maps. And there's a daunting challenge in mathematically modeling a network like this: how much data do you need to fit the model?
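
To give a rough sense of why the answer is so large, here is an illustrative back-of-the-envelope sketch in Python. The node and edge counts and the "data points per parameter" heuristic are my assumptions, not numbers from the talk.

    # Illustrative only: how data requirements grow when fitting a network model.
    nodes = 50             # proteins in a map like the inflammation network (assumed)
    edges = 200            # regulatory interactions between them (assumed)
    params_per_edge = 2    # e.g. a rate constant plus a Hill coefficient per edge
    points_per_param = 10  # common rule of thumb for a robust fit

    parameters = edges * params_per_edge
    data_points = parameters * points_per_param
    print(parameters, "free parameters need roughly", data_points, "data points")
    # 400 free parameters -> ~4000 data points, before replicates and conditions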

The answer is astronomical. It's a huge number; you need enough data points to get a robust fit for most models. So even if you could do better fitting with less data, you'd still need a crapton of data. A metric crapton. Say, hypothetically, we want to study that network by mutating some of its components. We know the assays, and there are five steps. With one worker on it, they can process maybe 10 plasmids or samples at once. With a more highly trained person, they can process more samples.
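
Just to make the labor math concrete, here is a tiny throughput sketch; the numbers echo the five steps and ten samples mentioned above, but the time estimates are my assumptions.

    # Hypothetical manual assay-chain throughput; labor scales linearly with headcount.
    steps = 5              # steps in the assay chain
    batch_size = 10        # samples one worker handles at a time
    hours_per_step = 1.0   # assumed hands-on time per step
    workday_hours = 8.0

    samples_per_worker_day = batch_size * workday_hours / (steps * hours_per_step)
    for workers in (1, 5, 20):
        print(workers, "workers ->", int(workers * samples_per_worker_day), "samples/day")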

The catch is that the better trained you are, the more likely you are to end up in a position where you don't actually do the benchwork yourself. The solution, of course, is that labor scales linearly: just add people. Unfortunately, this is what has turned some labs into science sweatshops. It becomes an issue because the people collecting data on your front lines are entry-level grad students and junior postdocs. Technology has some promise here.

You can replace workers with machines that do the jobs better and faster. In some cases you can get rid of more workers, improve throughput, and presumably unemployment goes up. Depending on the assay and the specific steps involved, you can completely eliminate manual labor from parts of the assay chain. But having machines to process the assays isn't enough, because samples don't magically move from machine to machine. Of course, as most people realize, it's possible to completely automate that too.

The innovation that enabled that, over the last 15 years or so, was the standard plate format: the microwell plate. The standard microwell plate has 96 wells; then there are 384-well plates with four times as many wells, 1536-well plates with four times that again, and even 3456-well plates. That's 3456 wells in a format this big. Unfortunately, people realized there are diminishing returns: the more wells you try to put on a plate, the more effort it takes to develop technologies that handle them robustly. A lot of biology is done with 384-well plates, and 1536-well plates are also a very common format. If you want to see what this looks like, this is what an automated lab looks like: you have the different machines and stations, and a robot arm that accesses all the relevant parts of the lab. You can imagine that setting up and programming a lab like this is very expensive. They're very idiosyncratic, requiring full-time engineers to keep them running. The only people who can access these facilities are the big pharma companies, or researchers who develop assays specifically to run on them.
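
For reference, the standard plate formats are just denser and denser grids on the same footprint; a quick sketch (the roughly 128 x 85 mm footprint figure comes from the plate standard, not the talk):

    # Standard microwell plate formats as rows x columns grids.
    plate_formats = {
        96:   (8, 12),
        384:  (16, 24),
        1536: (32, 48),
        3456: (48, 72),
    }
    for wells, (rows, cols) in plate_formats.items():
        assert rows * cols == wells
        print(wells, "wells =", rows, "x", cols, "on the same ~128 x 85 mm footprint")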

Biology is also full of individuals who are very smart and talented, with tons of free time. They've been developing microfluidics. That's the name given to a cluster of technologies that all have one thing in common: fluid handling at a scale small enough that you're in the laminar flow regime. Water behaves differently there, and there are advantages: improved reproducibility and reduced sample usage. Scientists have done some amazing stuff with it. There are journals full of proofs of concept of the most amazing things ever: single-cell multiplexed ELISA, single-cell PCR, single-plasmid PCR. But they're mostly proofs of concept, and they don't go anywhere outside the lab. We'll talk about why in a second.
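
To see why "small enough" means laminar, here is a quick Reynolds number estimate; the channel size and flow speed are typical assumed values, and the water properties are standard:

    # Reynolds number Re = rho * v * L / mu for a water-filled microchannel.
    rho = 1000.0   # water density, kg/m^3
    mu  = 1.0e-3   # water viscosity, Pa*s
    L   = 100e-6   # channel width, 100 micrometers
    v   = 1e-3     # flow speed, 1 mm/s

    Re = rho * v * L / mu
    print("Re =", Re)   # ~0.1, far below the ~2000 transition to turbulence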

In this schematic, the red boxes are closed valves and the green boxes are open valves. And the dark ones, well, they look the same shade here. Depending on which specific valves are open or closed, flow gets steered through the channels on the chip. That little box we're blowing up, we're going to play a movie of it to see what happens to cells, how they get deposited, and how these chips can run automated assays. So... do you see the little cells floating in there? Lights? They get nicely positioned into these horizontal chambers, and then, with a different set of valves, you can apply your stimulus or whatever else it is you want to do to the cells. All biology and chemistry is just moving fluids around tubes. That's all it is: humans and robots moving liquids around.
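
Conceptually, the valve logic is simple enough to sketch in a few lines of Python; the valve names and channel layout here are hypothetical, just to illustrate how open and closed states steer the flow:

    # A path carries flow only if every valve along it is open.
    valves = {"inlet": True, "waste": False, "chamber_row": True, "outlet": True}

    channels = {
        "load_cells":     ["inlet", "chamber_row"],
        "flush_to_waste": ["inlet", "waste"],
        "apply_stimulus": ["inlet", "chamber_row", "outlet"],
    }

    def flowing(channel):
        return all(valves[v] for v in channels[channel])

    for name in channels:
        print(name, "->", "flow" if flowing(name) else "blocked")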

How does microfluidics alter this? The answer, today, is: not much. There are microfluidic chips that process 100,000 samples on a single chip, but they've never really contributed significantly to the overall throughput of the assay pipeline. Getting that kind of processivity requires multiplexing, and the more you multiplex, the fewer steps of the assay chain you can do on the chip. So you're not altering the paradigm; you still have to take samples from one machine to the other. It's a little more complicated than that, as I mentioned; it's a question of tradeoffs. A lot of people think of it as having a fixed number of gates per chip: the more gates you devote to one function, the fewer you have to devote to another. A micro total analysis system implements a number of assay steps, but because it devotes chip space to those steps, there's a tradeoff: you can't multiplex it as much and you have fewer lanes.
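
Here is that tradeoff in back-of-the-envelope form; the valve budget and per-step and per-lane costs are made-up numbers, purely to show the shape of the constraint:

    # Fixed control-valve budget split between on-chip assay steps and parallel lanes.
    valve_budget = 200
    valves_per_step = 20   # valves consumed by each on-chip assay step (assumed)
    valves_per_lane = 2    # valves consumed by each parallel sample lane (assumed)

    for steps_on_chip in (1, 3, 5, 8):
        remaining = valve_budget - steps_on_chip * valves_per_step
        lanes = max(remaining // valves_per_lane, 0)
        print(steps_on_chip, "steps on chip ->", lanes, "parallel lanes")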

This could be paradigm shifting: once you get an entire assay pipeline onto a chip, you no longer have to worry about moving samples around, and you reduce costs by reducing the number of chips. Compare it to the size of a dollar bill; it's still a lot smaller than a room-sized machine setup.

So what are the issues, the realities, with microfluidics? The design of the chip and the positioning of the valves enable certain assays. A typical biology-plus-microfluidics paper ends up using two or three microfluidic designs, each taking maybe a year to design and produce. It takes a year to figure out where to place the components to get the assay you want. Microfluidic devices are like ASICs: very application-specific.

Let's just assume that we have a good enough assay, that the assay does whatever we need. Why can't we just do this? We have tons of underemployed or unemployed people; put them on it and let's go. The problem is that microfluidics needs expert users to set up, and I'll show you that in just a second. It's also the same reason why you can't do this either: microfluidics is not amenable to robotic automation, at least not without a huge investment.

This is an example of a typical microfluidic setup. I'll try to describe it: along the top are rows and rows of pneumatic valves that the experimenter bought and set up, with tubes everywhere. It can take 200 or 300 pneumatic lines to operate all the valves and move fluids around. So the assay-specific hardware needed to run a chip is almost as variable as the chip itself. Developing hardware, developing an instrument, is very expensive, so the companies that do have solutions for automated microfluidics focus on narrow, high-throughput markets like genotyping, where you have hundreds of thousands of samples.

Another related problem is that in science, your results depend on exactly how you did the experiment, and this is a big problem for open science: if you want to publish your data set, people have to know how it was collected. But these experiments are controlled by custom software and custom electronics units with no publications describing them. It's about to get good. So how do we get to approach #2?

We have these two competing approaches. The top one is the one we have today; $10B or more has been invested in it. Why would anyone invest any money in the micro approach when the first one works? Microfluidics is the... of science. They are the... For 15 years we've been promising microfluidics. So what I've been working on is how to make microfluidics easier to use. I developed a wetware kernel for microfluidics: software, hardware, tools, and processes that will hopefully enable users to start using it. This is the kernel. It's designed to provide a feature set comparable to any other incubator out there, because the heart of the microfluidic system is the incubator; biology happens at 37 degrees Celsius. Some of these features include humidity control.
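
The temperature side of that is essentially a control loop; here is a minimal bang-bang sketch, with the sensor and heater as stand-in functions rather than the actual Aquinas kernel firmware:

    import random
    import time

    SETPOINT_C = 37.0
    HYSTERESIS = 0.2             # switching band around the setpoint

    def read_temperature():      # stand-in for a real thermistor readout
        return 36.5 + random.random()

    def set_heater(on):          # stand-in for a real heater driver
        print("heater", "ON" if on else "OFF")

    heater_on = False
    for _ in range(5):           # a few iterations instead of an infinite loop
        t = read_temperature()
        if t < SETPOINT_C - HYSTERESIS:
            heater_on = True
        elif t > SETPOINT_C + HYSTERESIS:
            heater_on = False
        set_heater(heater_on)
        time.sleep(0.1)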

The cost is the most significant thing about it: less than $2,000, where competing commercial systems are more than $10,000. The incubator itself is designed to be an all-in-one microfluidic interface, which solves another big problem in microfluidics: setting the devices up. But the kernel is more than just the hardware. There are also design patterns. Collaborating on microfluidic devices requires a shared topology, so there are 3D design patterns and templates that describe the molds and interfaces in a way that borrows from object-oriented software design, dividing devices up so that people can make abstract templates. That's the only way to really collaborate on hardware.
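
The object-oriented analogy might look something like this in code; the class and port names are hypothetical, and the point is simply that an abstract template fixes the interface while derived designs fill in the details:

    from abc import ABC, abstractmethod
    from dataclasses import dataclass

    @dataclass
    class Port:
        name: str
        x_mm: float
        y_mm: float

    class ChipTemplate(ABC):
        """Abstract template: fixes the footprint and port interface."""
        footprint_mm = (75.0, 25.0)   # slide-sized footprint (assumed)

        @abstractmethod
        def ports(self):
            ...

    class CellTrapChip(ChipTemplate):
        """A derived design that fills in the abstract interface."""
        def ports(self):
            return [Port("inlet", 5.0, 12.5), Port("outlet", 70.0, 12.5)]

    chip = CellTrapChip()
    print(chip.footprint_mm, [p.name for p in chip.ports()])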

The other cool thing is that I have digital object identifiers for the templates; you can assign a DOI to a microfluidic template or hardware component. Another big area is fabrication of the devices. For a novel apparatus, this is my gold standard: can an unsupervised undergrad do it in the lab? If they can, then we know we've succeeded. For controlling the microfluidic device, the big nasty valve manifold has been replaced with a modular, scalable, open-source platform. It's essentially a complicated Arduino shield; it's everything you need to run the wetware. Here's an example of hooking one up, and now it's officially hooked up. Usually that takes at least 30 minutes. Now it's mounted onto the kernel. This is the kind of setup work that used to be the bottleneck.
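
Driving a shield like this from a PC can be as simple as sending serial commands; this sketch assumes pyserial and a made-up one-letter command protocol, since the real firmware commands aren't specified here:

    import serial   # pyserial

    def set_valve(port, valve_id, open_valve):
        # Hypothetical protocol: 'O' or 'C' followed by the valve number.
        cmd = ("O" if open_valve else "C") + str(valve_id) + "\n"
        port.write(cmd.encode("ascii"))

    # '/dev/ttyACM0' is a typical Linux device path for an Arduino; adjust as needed.
    with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as port:
        set_valve(port, 3, True)    # open valve 3
        set_valve(port, 7, False)   # close valve 7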

There are software components too: the Aquinas kernel software. It's built on Micro-Manager, the NIH-funded open-source microscopy control package. What we're trying to create is a royalty-free tool chain for every piece of the puzzle below the wetware itself. Once you have that, scaling up becomes a possibility. But that begs another question about the hardware: who's going to make it all?
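
For the imaging side, scripting against Micro-Manager looks roughly like this; a minimal sketch assuming the MMCorePy Python bindings and the demo configuration that ships with a local install:

    import MMCorePy

    mmc = MMCorePy.CMMCore()
    mmc.loadSystemConfiguration("MMConfig_demo.cfg")   # demo camera and stage devices
    mmc.setExposure(50)    # exposure in milliseconds
    mmc.snapImage()
    img = mmc.getImage()   # image comes back as a numpy array
    print(img.shape, img.dtype)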

Aquinas is all about a distributed manufacturing model, which means you don't need a centralized one. The resolution of the best 3D printers is around 15 microns. This creates jobs locally and around the world: people who build and support their own hardware. It's like a bridge between the software and the hardware, or between the software and the wetware. I see where we are now as the Cray supercomputer era of the 1980s. But in 1991 Linux was developed, and today every supercomputer is a cluster, about 90% of which run Linux. Today it's much more efficient to distribute tasks over large numbers of low-powered processors than to build one super-powerful pipeline, regardless of how wide that pipeline is.

In 20 years, what will biology look like? If we can get this working as a shared collaborative platform, then in 2032 we'll have automated data centers full of rack-mounted Aquinas kernels. We'll control them from desktops elsewhere, companies in this room will be managing all the data coming out of those data centers, and companies in this room will be making the components. Think of the wave of Googles, Amazons, and eBays, and think of the wave of jobs and workers that came with them. Running a cloud takes a lot of work, and it's not just the computer people that you need. The folks making the MakerBots are part of it too; they're out there. But none of this starts happening unless biology learns to work together, rather than doing what biotech has been doing forever, which is building bigger and better walls, on the theory that big walls make good neighbors.

Aquinas Sciences is a startup; we're commercializing and licensing this technology from the University of California, San Diego. Our priority is to drive adoption. Our business model is Red Hat for biology: services and installation. The problem is that we have to create Linux first. I'm afraid we'll end up like Netscape. That's it. Thank you.

Could you explain a little bit more about why the microfluidic platforms and assays currently in use are so specialized? It comes down to the high cost of developing the instrument. You have to justify that development cost, the VCs want something like a 100x return, and pretty soon you have an instrument that costs $500k or something, so you can only go after the markets that have that kind of size, which means high-throughput sequencing.

There's an abstract design language with templates, and the derived part is the final compiled, linked part. You get the compiled parts by just adding together the things that are the same color. I can make a new microfluidic design, including the top part, in under half an hour. We want a web-based platform; right now I use CATIA. What about an open-source CAD solution? Everything we're developing is being released under a dual-license scheme: a traditional commercial license, and our open-source license, which provides for royalty-free commercial usage but also has strong copyleft.