To quickly summarize some relevant human & animal genetics: the behavioral genetics paradigm traces back to the British “biometric” school of genetics, which began with Charles Darwin & Francis Galton discussing kinds of “blending inheritance” for the very gradual process of evolution, in which subtle changes compound over eons, as Darwin set out in Origin of Species; Galton’s investigations ultimately led to the central role of the normal distribution, the central limit theorem, linear models, and regression to the mean in mathematically modeling the inheritance of continuous traits like height, which was eventually perfected by R.A. Fisher’s infinitesimal model, where the continuity of traits like height (as opposed to simple discrete Mendelian traits, which shift between different phenotypes under the influence of one or a few genes) is due to the trait being caused by the simple additive sum of the average effects of thousands or tens of thousands of genetic variants. This could even account for the countless binary or discrete traits which clearly had genetic influence and ran in families but failed to follow any kind of Mendelian pattern whatsoever, such as alcoholism or schizophrenia, via the liability threshold model, where a threshold is defined and the phenotype manifests if the sum of all genetic variants & environmental influences (a normally-distributed variable) passes that critical total. This paradigm matches data from twin studies, adoption studies, and pedigree & family studies, and has enjoyed immense success in recent decades with the advent of genome sequencing. (And we could also use it to explain other fictional genetic scenarios, like muggles/squibs/wizards in Harry Potter or the 3 pony races in My Little Pony: Friendship Is Magic.)
Intelligence/IQ in particular fits this paradigm well: the consensus is that it is highly polygenic and additive, most of the relevant genetic variants are common ones, and while rare variants & de novo mutations are usually responsible for cases of severe retardation, most below-average intelligence is simply the lower end of a continuum, and there are few or no rare variants which cause extremely high intelligence or offer a large boost in intelligence.
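As a toy illustration of the liability threshold model, the following sketch builds a binary trait out of purely continuous additive causes; the locus count, environmental noise, and 2% threshold are all illustrative assumptions, not estimates for any real trait:

```python
import random

random.seed(0)

N_VARIANTS = 100   # many variants of small, equal, purely additive effect
N_PEOPLE = 10_000

def liability():
    # genetic part: sum of 0/1/2 allele counts at many loci; by the CLT
    # this sum is approximately normally distributed across the population
    genetic = sum(random.randint(0, 2) for _ in range(N_VARIANTS))
    environment = random.gauss(0, 5)
    return genetic + environment

people = sorted(liability() for _ in range(N_PEOPLE))
threshold = people[int(0.98 * N_PEOPLE)]   # only the top ~2% manifest the trait
prevalence = sum(x > threshold for x in people) / N_PEOPLE
print(round(prevalence, 3))   # a discrete trait from continuous causes
```

The trait is all-or-nothing in each person, yet there is no “gene for” it anywhere in the model, which is why such traits run in families while defying Mendelian pedigree analysis.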

Given this, the breeding program in Dune makes no sense. (For a good review of quantitative genetics animal breeding methods, see “One Hundred Years of Statistical Developments in Animal Breeding”, Gianola & Rosa 2015; for what Frank Herbert would’ve had access to at the time, see the dean of American animal breeding, Jay Laurence Lush’s enormously influential textbook, Animal Breeding Plans.)

A continuous polygenic trait responds quickly to selection; in discussing eugenics, even the most pessimistic estimates by R.A. Fisher of how many generations it might take to drastically increase average human intelligence or almost entirely eliminate a nasty recessive came to perhaps 20 generations—certainly not “thousands of generations”. This would hold true of other traits one might select for, and selecting for many traits simultaneously would increase the number of generations only modestly. Since there are few or no rare variants fostering extremely high intelligence or other desirable traits, all of the necessary variants exist already in the human gene pool and merely need to be increased or decreased in frequency, which can be done rapidly without waiting centuries (or tens of millennia) for “hopeful mutants”. Due to the CLT, for a highly polygenic additive trait, the starting population mean may be extremely distant from the end result of a selective breeding program (no chihuahua will ever have a puppy as big as the average Saint Bernard), so it would be incorrect for the Bene Tleilax to claim that the baseline human population could occasionally throw up individuals anywhere near as extreme as their ultimate ‘essences’.
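To illustrate the speed of response, here is a minimal sketch of truncation selection on a purely additive trait; the 200 loci, population of 200, top-20% selection, and implicit heritability of 1 are illustrative assumptions:

```python
import random

random.seed(0)

N_LOCI, POP = 200, 200

def phenotype(genome):
    # purely additive: just count the '1' alleles across all loci
    return sum(a + b for a, b in genome)

def child(mom, dad):
    # Mendelian segregation: one allele drawn from each parent per locus
    return [(random.choice(m), random.choice(d)) for m, d in zip(mom, dad)]

# founding population: allele frequency 0.5 at every locus
pop = [[(random.randint(0, 1), random.randint(0, 1)) for _ in range(N_LOCI)]
       for _ in range(POP)]
start = sum(map(phenotype, pop)) / POP

for generation in range(20):
    pop.sort(key=phenotype, reverse=True)
    parents = pop[:POP // 5]   # truncation selection: only the top 20% breed
    # random mating among the selected (ignoring sexes & selfing for simplicity)
    pop = [child(random.choice(parents), random.choice(parents))
           for _ in range(POP)]

end = sum(map(phenotype, pop)) / POP
print(start, end)   # the mean climbs many standard deviations in 20 generations
```

Nothing new is created: pre-existing common variants simply rise in frequency, which is exactly why no “hopeful mutants” need be awaited.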

If anyone did run such a program, it would be self-defeating to restrict it to a few aristocratic families: they would regress to the mean, be constantly diluted by intermarriage with the general population (especially over millennia), and suffer the various ills of a small effective population, which greatly increases the role of bad luck and exaggerates the effects of genetic drift. It would be best to run it on as egalitarian a basis as possible, and if it must be limited to a certain size, to select anyone with high trait values with little regard for lineage.
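To put rough numbers on the drift risk, a minimal Wright-Fisher sketch shows allele frequencies wandering far more in a tiny closed breeding population; the population sizes, generation count, and 50/50 starting frequency are all illustrative assumptions:

```python
import random

random.seed(0)

def drift(n_individuals, generations=50):
    # Wright-Fisher model: each generation, 2N alleles are drawn
    # binomially from the previous generation's allele frequency
    n_alleles = 2 * n_individuals
    count = n_alleles // 2   # start at 50% frequency
    for _ in range(generations):
        freq = count / n_alleles
        count = sum(random.random() < freq for _ in range(n_alleles))
    return count / n_alleles

small = [drift(20) for _ in range(50)]    # a handful of noble houses
large = [drift(200) for _ in range(50)]   # a broad breeding base

def mean_wander(freqs):
    # average distance the allele frequency has drifted from its 0.5 start
    return sum(abs(f - 0.5) for f in freqs) / len(freqs)

print(mean_wander(small), mean_wander(large))
```

In the small population, many loci drift all the way to fixation or loss regardless of what the breeders want, which is the “bad luck” a large egalitarian program avoids.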

Should a high genetic level be reached and maintained, it would then not be especially important to mate this person with that person, since it is only the average which matters, and a misaimed marriage simply means a slight reduction in selection efficiency; nor would there be anything particularly special about a brother-sister inbreeding, other than incurring inbreeding depression & an increased risk of birth defects/genetic diseases (which is not good but tolerable as long as it is not repeated, in some eras such a mating was quite common, and even now cousin marriage is common & preferred in many societies). Likewise, it would not be terribly important to carry the exact cells of particular noted figures to clone them rather than, say, a sibling or several more distant relatives.
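The point that only the average matters follows from the standard midparent prediction for an additive trait: expected child value = population mean + h² × (midparent deviation). A back-of-the-envelope sketch, where the IQ-style scaling and h² = 0.8 are illustrative assumptions:

```python
def expected_child(parent1, parent2, pop_mean=100.0, h2=0.8):
    # standard midparent prediction for an additive polygenic trait
    midparent = (parent1 + parent2) / 2
    return pop_mean + h2 * (midparent - pop_mean)

# two exceptional parents still regress toward the population mean:
print(expected_child(145, 135))   # 132.0, not 140
# a single 'misaimed' marriage costs little at the group level,
# since it is the average over many matings that matters:
print(expected_child(145, 100))   # 118.0
```

One misaimed pairing shifts one family's expectation; the program's trajectory is set by the mean over all pairings, so no individual marriage is pivotal.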

So is it all just complete nonsense?

Well, there is an alternative paradigm of genetics, which may be more familiar to most readers: the Mendelian paradigm, traceable to Gregor Mendel and his peas, but taken up with great enthusiasm by Americans. In Mendelian genetics, the focus is overwhelmingly on single genetic variants with large effects, which since they come in pairs can have simple additive dose-response effects (0/1/2 copies), non-additive effects such as “dominance” (one copy is enough to cause the trait) or “recessiveness” (two copies are required), and lead to complicated inheritance patterns where a trait may disappear but then pop up many generations later, or where several genetic variants may only have a particular effect when all of them are present simultaneously (“epistasis”). Mendelian genetics applies well to a number of rare human diseases, and a few oddball traits, but works particularly well in agricultural and scientific settings, where it can be demonstrated vividly and used to track mutations and investigate their effects, among many other things. The development of Mendelian genetics thus led to a notoriously bitter academic dispute between the biometricians and the Mendelians, because neither side was wrong: there clearly were Mendelian traits which were busily being experimentally demonstrated in plants and flies and mice, but it was also clear that Mendelian approaches couldn’t account for traits like height. (For more background, see Provine’s The Origins of Theoretical Population Genetics, Gillham’s A Life of Sir Francis Galton, and Paul & Spencer 1995.) The feud was only partially resolved by R.A. Fisher’s famous unification demonstrating that the continuous traits could be seen as simply the sum of indefinitely many genetic variants, each of which acted in a Mendelian manner.
(In particular, Mendelianism was avidly adopted by American eugenicists, who proceeded to interpret traits like low intelligence or alcoholism or schizophrenia as being single Mendelian genetic variants, often recessive, rather than as part of a continuum in which sufferers merely have bad luck and a somewhat lower-than-average number of favorable variants. While under a biometric paradigm it would be about equally effective to try to increase intelligence by raising the fertility of the more intelligent as by lowering the fertility of the less intelligent, under American Mendelianism run amok, all low intelligence/mental illness/disease ‘must’ be due to recessives as proven by sloppy, biased—or perhaps even falsified—pedigree charts of arbitrarily dichotomized traits; and since increasing the fertility of high-trait people is then largely futile, in the absence of any kind of genetic testing, coercive government-backed sterilization becomes the natural approach—especially for the strong liberal progressive tradition in America, which advocated government intervention to reshape society, eg Prohibition or the minimum wage.)

The Mendelian monogenic paradigm, with heavy emphasis on various nonlinear or interaction effects rather than simple additive effects, remains highly influential in scientific research. (Indeed, perhaps far more than it should be. It seems to me that the disappointed hopes of rapidly finding most disease-causing genetic variants after the Human Genome Project rested on quasi-Mendelian beliefs and a disregard of the evidence for the high additivity & polygenicity of most human traits including diseases, and were responsible for the candidate-gene debacle where almost all candidate-gene hits were shown by later GWASes to be false positives. So opposed to the standard behavioral genetics paradigm were many researchers & commentators that they used the initial GWAS null findings, indicating that the “missing heritability” was due to polygenicity with many small effects and thus required sample sizes upwards of n>100,000, as a reductio ad absurdum which disproved the entire enterprise; ultimately, of course, those sample sizes were reached and the hits have kept coming ever since.

Why is this the case? Perhaps because it suffers the dual problem of being offensively theoretically simple yet practically difficult to deal with; early Mendelians (eg Bateson) complained of the difficulties of understanding Galton, Pearson, or Fisher’s mathematics, while applying the statistics at all must have been enormously painful in an era where even mechanical calculators were not always available, and the implications are that for some things like GWASes hundreds of thousands or millions of samples were required, all in the service of a theory whose intellectual charm & subtlety are difficult to appreciate, and which seems prima facie false to anyone familiar with the intricate endlessly complex pathways and feedback loops of real biological systems. And yet, ‘it moves’, for all the sophistication and nuance of Mendelian theories reveling in epistasis and dominance. It can be easy to read small n data in Mendelian ways, assuming away anomalies as measurement error and the usual ‘crud factor’ of scientific research—a striking recent example is the 60-year-long mistaken belief that catnip response is a single Mendelian autosomal dominant trait based on Todd 1962’s 34 cats which turns out to be an additive polygenic liability-threshold trait when studied more rigorously with n~210. In addition, the monogenic approach is indisputably successful in describing many dramatic genetic diseases. And, of course, the eugenics implications for humans of Mendelian-style genetics are much less, in exactly the ways Herbert inadvertently illustrates. So perhaps we should not be too hard on researchers who naively expected to find a few dozen genetic variants which could account for most differences in intelligence or health, and which could be found looking under the lamp post using easy samples like n=100.)

In particular, as extended by Sewall Wright, it is heavily used in animal and plant breeding in creating new strains of plants with a specific desired trait, often crossed in from another varietal or even species. In those scenarios, where one is trying not to exaggerate existing traits but to copy an entire novel trait—resistance to a particular pesticide or insect, perhaps, or salt resistance, or a coat color—there may be more than one genetic variant at work: you may need a whole gene copied over from the other organism, perhaps several of them working in concert, acting epistatically, a “gene complex” as Wright dubbed them. Epistasis makes breeding difficult because the new set of genes might be broken up immediately by recombination. If there are, say, 3 new genes brought over into an organism and it has some offspring with an unmodified organism, each offspring will have 1⁄2³ = 1⁄8 odds of inheriting the full set of 3; so of 8 offspring, perhaps 7 will not have the desired trait because only the 8th managed to get all 3 simultaneously. Inheriting only 1 or 2 of the 3 is no good. These are not good odds and complicate things (if you can only see the result when all 3 genes are inherited, how do you know whether there were 0, 1, or 2 in the ones without the trait?). As Lush puts it in his 1943 textbook:

Selection for epistatic effects is somewhat like building a sand pile on the seashore exposed to each incoming wave. It is easy to build a little pile between waves, but each wave which rolls over it tends to flatten out the pile. When building is stopped, some traces remain after the first wave and perhaps even a few after the second and third, but soon practically all traces of the pile are leveled away. If building continues between waves, the pile can be built a little higher before the second and third waves than it was built before the first wave but soon a size is approached which can just be maintained, the building between waves being just enough to repair the leveling action of the preceding wave.

Or as E.O. Wilson would put it, directly considering human geniuses (On Human Nature, 1978):

Truly exceptional individuals, weak or strong, are, by definition, to be found at the extremes of statistical curves…Since each individual produced by the sexual process contains a unique set of genes, very exceptional combinations of genes are unlikely to appear twice even within the same family. So if genius is to any extent hereditary, it winks on and off through the gene pool in a way that would be difficult to measure or predict. Like Sisyphus rolling his boulder up to the top of the hill only to have it tumble down again, the human gene pool creates hereditary genius in many ways in many places only to have it come apart in the next generation.

To investigate or select, one must carry out time-consuming and difficult breeding across multiple generations, various crosses of related organisms, and so on, before one can eventually deduce all of the relevant parameters and introduce every needed gene enough times for the cross to take.
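The 1⁄8 arithmetic above can be checked with a short simulation; the 3-gene complex and the carrier × non-carrier cross are the hypothetical setup from the text:

```python
import random

random.seed(0)

K = 3   # genes in the complex; the trait is visible only with all K present

def inherits_full_complex():
    # carrier x non-carrier cross: each of the K genes is passed on
    # independently with probability 1/2
    return all(random.random() < 0.5 for _ in range(K))

trials = 100_000
hits = sum(inherits_full_complex() for _ in range(trials))
print(hits / trials)   # ~0.125 = 1/2**3, ie. roughly 1 offspring in 8
```

And since the partial carriers (1 or 2 genes) look identical to non-carriers, the breeder cannot even tell which 7⁄8 to discard without further test crosses, which is Lush's sand-pile problem in miniature.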

But… what you can do is, once you have managed the cross, create a line of organisms which breeds true for the trait by extensive inbreeding or cloning: “line-breeding”. (The use of inbreeding for developing new lines is generally attributed to the early English breeder Robert Bakewell; before him, breeders avoided inbreeding as ‘incest’ and worried about inbreeding depression, engaging instead in extensive cross-breeding of varieties, which, while avoiding both of those problems, drastically slows down progress, obscures heritability, and inevitably muddles any sharp distinctions.) If you get, say, 16 offspring from that organism, and take the two sibling organisms with the trait, both of which you know have the full gene complex of 3 variants, and you mate them, then all of their offspring will express the trait because the 3 variants have been fixed within that line. (This sort of incestuous inbreeding approach would also help with purging harmful recessives: because they are so related, offspring will often have two copies of a harmful recessive, and it will immediately cause ill health or death, rather than continue floating around the general population.) Or you could clone them and ensure the genes (and thus the trait) are preserved that way—cloning is especially common in plants, and many famous plant varieties are propagated clonally because their characteristics would be lost if they were propagated sexually. (Apples are a famous example: wild apples exhibit tremendous variety but typically all taste bad and are useful mostly for making hard cider; the supermarket apple varieties all stem from single “chance seedling” apple trees discovered on farms to be unusually tasty, and then are propagated clonally for commercial sale.
The Granny Smith Festival commemorates the legendary discovery of the popular green Granny Smith apple underneath Maria Ann Smith’s kitchen window, although it may also have been found growing in a pile of discarded French crab-apples, while Ginger Gold was accidentally discovered after a hurricane knocked down its surrounding normal apple trees; apple varieties can be discovered when individual limbs of trees mutate, creating bud sports, see Foxwhelp/Gala/Cripps Pink/Winesap. Basically, apples work in real life the way the X-Men work in fiction.) Then you can use that strain directly, or employ it in various other breeding programs while perpetuating the line indefinitely. All of this is common in, for example, plant cultivars or in the special mice & rat breeds (Green 1966) used in lab work. Or if a desired mutation suddenly pops up in an individual, perhaps encouraged by the use of mutagenesis like “atomic gardens”, the mutation will be lost if it is not bred heavily, possibly with relatives. There can be additional advantages to inbreeding selection (see chapter 23/30 of the draft Walsh & Lynch textbook on selection), especially in situations in which past generations can be re-bred, such as with saved seeds or in the case of Dune, gholas repeatedly cloned from old cells.
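The purging point can be made quantitative with Wright's standard formula for genotype frequencies under inbreeding: with inbreeding coefficient F, a recessive allele at frequency q appears as a homozygote at rate q² + Fq(1−q). A minimal sketch, where q = 1% and the full-sib F = 0.25 are illustrative numbers:

```python
def recessive_exposure(q, F):
    # Wright's formula: homozygote frequency under inbreeding coefficient F
    return q**2 + F * q * (1 - q)

q = 0.01   # a rare harmful recessive allele
random_mating = recessive_exposure(q, F=0.0)    # q^2 = 0.0001
full_sib = recessive_exposure(q, F=0.25)        # ~0.0026: ~26x more often exposed
print(random_mating, full_sib)
```

Under random mating a rare recessive hides in heterozygotes almost indefinitely; sib-mating drags it into homozygotes roughly 26× more often, where selection can finally see and eliminate it.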