According to reductionism, every system, no matter how complex, can be understood in terms of the behavior of its basic constituents. The focus is on the bottom layer of the material chain: matter is made of molecules; molecules of atoms; atoms of electrons, protons, and neutrons; protons and neutrons of quarks; we don't know if the buck stops here or not.
At the biological level, organisms are composed of organs; organs of cells; cells of organic macromolecules; those of many atoms, etc.
The more radical reductionists — lost, perhaps, in the fog of the eighteenth century's giddy mechanization of reality — claim that all behaviors spring from a few fundamental physical laws: when we uncover these laws at the most basic level, we will be able to extrapolate to higher and higher levels of organizational complexity.
In practice, reductionists know (or should know) that this extrapolation is impossible: studying how quarks and electrons behave won't help us understand how a uranium nucleus behaves, and much less genetic reproduction or how the brain works. Hard-core reductionists would stake their position as a matter of principle, a manifesto of what they believe is the final goal of fundamental science, the discovery of the symmetries and laws that dictate (I'd say "describe" to the best of our ability) the behavior of matter at the subatomic level.
There is no question that we should celebrate the triumph of the reductionist approach over the first 400 years of science: many of the technological innovations of these past four centuries stem from it, as does our ever-deepening understanding of how Nature works. In particular, our digital revolution is a by-product of quantum mechanics, the branch of physics that studies atoms and subatomic particles. Let's then quickly examine how difficulties emerge as we go bottom-up.
We know how to describe with great precision the behavior of the simplest chemical element, the hydrogen atom with its single proton and electron. However, even here trouble lurks as we attempt to include subtle corrections, for example accounting for the fact that the electron orbits the proton at relativistic speeds (i.e., close to the speed of light), or that its intrinsic rotation (or spin) gives rise to a magnetic force that interacts with a similar magnetic force of the proton. Physicists take these effects into account using "perturbation theory," an approximation scheme that adds small changes to the allowed energies of the atom.
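To make "small changes" concrete, here is the standard textbook result (an illustration added here, not taken from the text): the unperturbed hydrogen energies together with the fine-structure correction that perturbation theory produces,

$$E_n = -\frac{13.6\,\text{eV}}{n^2}, \qquad \Delta E_{n,j} = E_n\,\frac{\alpha^2}{n^2}\left(\frac{n}{j+\tfrac{1}{2}} - \frac{3}{4}\right),$$

where α ≈ 1/137 is the fine-structure constant and j is the electron's total angular momentum. Since α² ≈ 5 × 10⁻⁵, the correction is tiny compared with E_n itself, which is precisely why the perturbative scheme works so well for hydrogen.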
Physicists can also describe the next atom of the periodic table, helium, with considerable success due to its high degree of symmetry. But life gets complicated very quickly as we go up in complexity. More drastic and less efficient approximation schemes are required to make progress. And these schemes don't yet include the interactions between protons and neutrons in the nucleus (which call for a different force, the strong nuclear force), much less account for the fact that protons and neutrons are themselves made of quarks and gluons, the particles responsible for the strong interactions.
Physics is the art of approximation.
We learn to dress down complex systems to their bare essentials and model them in as simple terms as possible without compromising the goal of understanding the complicated system we started from. This process works well until the complexity is such that a new set of laws and approaches is necessary.
At the next level of complexity are the molecules, assemblies of atoms. In a very rough way, all chemical reactions are attempts to minimize electric charge disparities. How many molecules can exist? Let's jump to biochemistry for an illustration. Proteins are chains of amino acids, specified by an underlying DNA code. Since there are 20 different amino acids and a typical protein has some 200 of them, the number of possible proteins is around 20^200. Increasing the length of the protein, and hence the possible choices of amino acids, leads to a combinatorial explosion. Physicist Walter Elsasser coined the term "immense" to describe numbers larger than 10^100, a googol (a one followed by 100 zeros). Thus, the number of possible proteins is certainly "immense." We only see a small subset realized in living creatures.
The number 10^100 is not arbitrary. Elsasser showed that a list containing 10^100 molecules would require a computer memory containing more matter than exists in the universe. Worse, to analyze the contents of the list we would need more time than the age of the universe, 13.7 billion years. We can't know all possibilities.
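The arithmetic behind these "immense" numbers is easy to verify. A minimal sketch, using the figures quoted above (20 amino acids, chains of roughly 200, and Elsasser's googol threshold):

```python
# Elsasser's threshold: a number is "immense" if it exceeds a googol.
googol = 10**100

# Possible proteins: 20 amino acid choices at each of ~200 positions.
n_proteins = 20**200

# 20^200 has 200 * log10(20) ≈ 260.2 decimal digits, i.e. roughly 10^260.
digits = len(str(n_proteins))
print(digits)               # 261
print(n_proteins > googol)  # True: certainly "immense"
```

Python's arbitrary-precision integers make the exact count trivial to compute, even though no physical memory could ever hold a list of that many molecules.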
At least chemists and biochemists will never be out of work.
There is an immense number of new molecules with unknown properties to be explored. The same goes for the number of genetic combinations, cell types, and mental states.
It is thus impossible to predict the behavior of complex biomolecules from a bottom-up approach based on fundamental physical laws. The passage from one level of material organization to the next is not continuous. Each new layer of material organization requires new laws and new ways of thinking, generally grouped under the heading of "complexity laws." This is a somewhat novel area of research that is attracting both friends and foes. A technical book is available online.
Nobel prize winner Philip Anderson wrote a prescient essay in 1972, "More Is Different," where he argues for this layering of physical laws, which are irreducible: we can't deduce the laws of a higher layer by working upward from the layer below. The reductionist program hit a brick wall.
So are the days of reductionism over? Not at all. But there is a clear need for a complementary approach based on this new vision of how Nature works. An open question is whether the new program, that of obtaining a finite set of complexity laws, is feasible or is itself still a child of reductionism.
Perhaps Nature's unruliness will lead to an immense number of laws and the best that we can do is identify those that describe the systems we can probe.