The all-meat carnivore diet is so controversial that advocates and critics have been arguing about it for years. It’s easy to get confused by what appears to be conflicting information. Let’s break it down and take a closer look at these arguments – and the science behind them.

What is the Carnivore Diet?

You may have landed here from our previous article on nutrient deficiencies in the carnivore diet, which details a list of potentially dangerous deficiencies that can result from the diet.

If so, then you know the carnivore diet is exactly what it sounds like: a diet plan that only includes meat. If you haven’t read our nutrient deficiency post, you might want to start there.

The diet is controversial. Its advocates claim that it improves brain function and stops autoimmunity and inflammation in its tracks. Critics point to potential deficiencies and a lack of well-powered research supporting such a radical diet. In the online carnivore and zero carb communities, the argument gets pretty heated. It can be hard to take a step back and think critically.

We’d like to address some of the most common claims and arguments presented by carnivores. After all, it’s important to do your research, to know all the facts, and to avoid dogma of any kind.

Common Arguments from Carnivores

1) “Recommended Intakes Were Developed for the Western Diet, Not a Carnivore Diet”

Within the zero carb community, the most common response to the nutrient deficiency argument is that daily recommended intakes were calculated for people eating a Western diet. These carnivores insist that people on an all-meat diet have different (they say lower) requirements for vitamins, minerals, and fiber.

No evidence supports this line of thought. In fact, carnivores may often need more than the recommended intake of antioxidant nutrients. Eating a lot of meat – especially if cooked at high temperatures – raises oxidative stress [1].

The recommended intakes may also be too low for some individuals, depending on their health and habits. Experts and researchers base these recommendations on quantities that will meet nutrient requirements in nearly all healthy people. These are called Recommended Dietary Allowances (RDA). If there’s not enough evidence to come up with an RDA, an Adequate Intake (AI) is used instead [2, 3].

However, there are several issues with these values:

Nutritional requirements are influenced by personal genetics [4].

The values only meet the needs of 97-98% of all healthy people – not 97-98% of all people. Worldwide, one in three adults suffers from multiple chronic health issues; in America, nearly 45% of the population has at least one chronic disease. Plus, requirements for certain nutrients rise as people age [5, 2, 6, 7].

The recommendations don’t take into account factors that could affect nutrient digestion, absorption, and use in the body [2].

They are also not adjusted for various other influences, such as climate, physical activity, and medication use [2].

Finally, the recommended daily intake for some nutrients hasn’t yet been set [2].

Look into the factors that apply to you to figure out if you may need more or less of any given nutrient. Such modifications will not dramatically differ from the recommended values, however. RDAs/AIs are still your best rough bet, at least until individualized guidelines come out.

If you have your heart set on the carnivore diet, we strongly recommend working with a licensed nutritionist or a doctor to ensure that you don’t seriously damage your health in the process.

The Potassium Exception

One study has investigated potassium and sodium balance in athletes eating a low-carbohydrate, mostly meat diet based on the traditional diet of the Inuit. Meat often does not contain enough potassium to reach the daily recommended intake on its own [8].

The participants of this study experienced an initial decline in blood potassium; levels stabilized after four weeks on the ketogenic “Inuit” diet. Once these people had adapted to the ketogenic diet (a process that took about 3-4 weeks), their potassium requirements appeared to decrease from 3.4 g per day to about 1.5 g per day [8, 9].

Hopefully, researchers will conduct additional studies on other nutrients in the future. Until then, it’s better to be safe and try to reach all of the nutrient recommendations. Work with a nutritionist to ensure that your diet is not deficient in important nutrients.

Incorrect Food Labels

Some carnivores claim that the discussion of recommended intakes is largely nonsensical because food labels may not correctly report nutrient content. On the one hand, fair enough: the official nutrient content of various foods has been changing for decades; certain nutrients, such as fiber, are more difficult than others to measure [10].

On the other hand, the USDA’s National Food and Nutrient Analysis Program (NFNAP) exists precisely to prevent and correct such errors. This program, which has been operating since 1997, constantly updates and expands its data and methods. There is no guarantee that the current USDA data is absolutely correct, but it’s the best data available [11].

2) “Nutrients are More Bioavailable in Meat”

Some carnivores argue that nutrients are more easily absorbed from meat sources than from plant sources. In certain cases, this is true:

Some plant compounds, such as phytates, may decrease the absorption of iron and zinc, whereas high-protein foods increase zinc absorption. Retinol, the active form of vitamin A, is often available directly from meat, whereas plants contain carotenoids that must be converted to vitamin A in the body [12, 15, 16].

Protein and Nutrient Absorption

Protein intake changes the chemistry of the digestive system and may affect nutrient uptake as well. High-protein diets encourage more acidity in the stomach, which in turn increases the absorption of some nutrients. Low-protein diets reduce the production and flow of bile, which is important for absorbing fat-soluble vitamins (such as vitamins A, E, and D) [17].

Plant-based foods can be more difficult to digest because of their sturdy cell walls. This means that your digestive enzymes can break meat down to release the nutrients with greater ease. Mincing meat additionally increases nutrient bioavailability [18].

Nutrient Deficiencies

On the other hand, meat does not contain some vital nutrients, like vitamin C, that are required to maintain health. Read this post on nutrient deficiencies in the carnivore diet for more.

3) “Humans are Adapted to Eating Meat, Not Plants”

A few popular diets (the carnivore diet included) are based on the idea that modern humans just aren’t eating the appropriate foods for our evolutionary requirements.

There’s a grain of truth to this argument: archaeological and genetic evidence suggests that early modern humans ate a fatty, protein-rich diet shortly before the advent of carbohydrate-rich agriculture [19, 20, 21, 22].

However, there’s a significant logical jump between “early modern humans ate a fatty, protein-rich diet” and “humans aren’t adapted to eating plants.” The first doesn’t lead to the second. Let’s break it down further.

The Neanderthal Question

Some people defend the carnivore diet with the claim that our ancestors ate diets of almost exclusively meat. Take care with these claims; some of them may be based on evidence about Homo neanderthalensis, not Homo sapiens. These are two different species, and Neanderthal contributions to modern human DNA are very small (0% in African populations and up to 2% in Europeans and Asians) [23, 24, 25].

Neanderthals were, for all intents and purposes, functional carnivores. Many researchers believe they relied heavily on mammoths as their primary food source and were probably the mammoth’s most important predator [24, 26].

However, even Neanderthals ate plant foods; archaeological evidence suggests that Neanderthals ate starchy roots and seeds. They may also have consumed plants as medicine [24, 26, 27].

In short:

Neanderthals were not modern humans, and what is “evolutionarily appropriate” for them may not be so for us.

Even assuming that Neanderthals and modern humans do share an appropriate diet, Neanderthals ate starchy root vegetables and were not strict carnivores.

These facts say nothing about whether the carnivore diet is good or bad for any one person. They simply remind us not to make sweeping generalizations about what is “appropriate” based on inappropriate information.

The Evolution of Omnivory

This much is clear: humans are adapted to eat meat. Our large brains and ability to walk on two legs may be closely linked to increased meat intake and a need to carry meat, respectively [28].

Our teeth and guts are appropriate for a mixed diet. Our molars are smaller and our front teeth and jaws are stronger than those of our plant-eating primate ancestors. Our digestive system shares traits with both herbivores and carnivores. We do not have the short, hyper-acidic gut of a true carnivore, nor the long fermenting colon of a true herbivore [28, 29].

Humans are highly adaptable to our environments. Our bodies are extremely flexible, and most people have a lot of options for diet and good health. Some researchers argue that the phrase “evolutionarily appropriate diet” is meaningless because we evolved to eat almost anything [30].

Of course, that doesn’t mean that a diet of “almost anything” will result in optimal – or even good – health.

Plant Food and Performance

There is no evidence that plant-based food negatively affects the health or physical performance of the average person in the general population.

In a study of endurance athletes, peak performance was not significantly different between vegetarian and omnivorous groups. Obviously, this study says nothing about whether carnivores would have outperformed the others; however, it does suggest that a vegetarian diet is no worse for performance than a mixed plant- and animal-based diet [31].

Plant-based diets have been suspected of producing lower bone mineral density, especially in young, growing adults. This doesn’t appear to be the case, either: when controlling for physical activity, obesity, and smoking, bone mineral density is not affected by a plant-based diet [32].

In Brief

Humans are adapted to eating meat. We are also adapted to eating plants. Our teeth and digestive systems are versatile and flexible; evolution has set us up to eat just about anything.

4) “Fresh, Raw Meat has Everything We Need”

Some carnivores advocate eating fresh, raw or very lightly cooked meat to ensure that the nutrients inside aren’t degraded or otherwise lost. Most notably, vitamin C easily breaks down when heated. Many other nutrients, such as the B vitamins, are water-soluble and can, therefore, be lost with the juices [33, 34].

Take care, however. There are plenty of arguments against eating meat raw.

Bacterial Contamination

If you choose to eat raw or lightly-cooked meat, be aware of the risk of bacterial contamination. Every physical step during meat processing (slaughter, cleaning, butchering) introduces new bacteria into an animal carcass. These include pathogens like Pseudomonas species, Escherichia coli, and Staphylococcus aureus [35].

Organic meat from healthy animals has no less pathogenic bacterial contamination than factory- or conventionally-farmed meat. However, antibiotic-resistant bacteria are much more common in meat from factory- or conventionally-farmed animals [36, 37].

However, organic meat still contains potentially dangerous bacteria.

Parasite Risk

Bacteria aren’t the only potential threat of raw meat. Parasites may be present in the muscles and organs of common livestock, fish, and game.

Raw beef liver may contain Toxocara canis, a roundworm that typically infects dogs but can cause a dysregulated immune system and lesions in the liver and lungs of humans. Raw pork and pig liver (often consumed as sashimi in Japan) can carry tapeworms. Wild game meats like bear and boar may contain Trichinella roundworms. Raw and undercooked salmon often contains Anisakis roundworms that quickly produce food poisoning-like symptoms [38, 39, 40, 41, 42].

Deep freezing for 3-4 weeks will kill many, but not all parasites. Some species of Trichinella can survive being frozen for more than a year [43].

The Genetics of Cooking

There’s good evidence that humans evolved (in the last million years, but before Neanderthals and modern humans diverged) to eat cooked food, rather than raw [29, 44].

One reason for this adaptation is that cooked food has more available energy than raw food. The same cut of meat provides more energy cooked than raw [29].

5) “We Don’t Need Fiber to Stay Healthy”

The medical consensus is that we need fiber for healthy bowel movements and to prevent constipation; carnivores claim this isn’t true.

That’s not the only point of contention, however.

Stool Quality & Constipation

Strangely, reducing or eliminating fiber resolved chronic constipation, as well as the pain, bloating, and bleeding associated with it, in two studies with a total of 779 participants [45, 46].

In a study of 63 people with constipation of unknown cause, eliminating fiber completely reversed their symptoms [46].

Magnesium Absorption

Phytates in high-fiber foods such as fruits, vegetables, and grains could potentially decrease the amount of magnesium absorbed in the gut. This may put people who eat a lot of plant-based foods at risk of magnesium deficiency, which in turn causes gastrointestinal symptoms, fatigue, and weakness [47, 48].

On the other hand, many plant-based foods are rich enough in magnesium to make up for this effect. Spinach, nuts, and seeds (such as pumpkin seeds) are especially high in magnesium [49, 50].

High protein intake may also improve magnesium absorption, but it doesn’t seem to matter whether the proteins are animal- or plant-based [51].

Healthy Gut Flora

Dietary fiber promotes the growth of gut bacteria that are generally recognized as healthy and beneficial. People who eat a lot of complex carbohydrates tend to have more Prevotella, a type of bacteria that ferments fiber into short-chain fatty acids (SCFAs) like butyrate. SCFAs, in turn, nourish the intestinal wall and protect against inflammation and blood clots [52, 53].

Modern-day tribes who have little contact with the developed world also tend to have the most diverse gut flora out of anyone on Earth. High gut flora diversity is associated with both improved health and increased dietary fiber intake [52, 54].

By contrast, low-fiber diets have been associated with unhealthy gut flora that degrade the mucus lining the intestinal walls [55].

A rabbit study found that the ratio between starches and fiber determined the composition of bacteria living in their intestines. It is possible that the presence of starch is more of a “problem” than the absence of fiber, but this possibility has not been formally researched [56].

So how exactly can high-protein (low-fiber) diets disrupt the gut microbiome? Some researchers have suggested that incomplete protein digestion increases the growth of certain bacteria in the colon that create toxic byproducts (so-called “putrefactive metabolites”). This is even more likely in people with intestinal diseases, including IBD [57, 58].

In one study, people on low-protein weight loss diets (13% protein, low in fiber and carbs) had fewer toxic metabolites in their stool than those following high-protein versions of the diet (29% protein). Still, the exact effects of a carnivore-like diet on the microbiome have yet to be researched [59].

Cardiovascular Disease

A diet rich in water-soluble fiber can decrease cholesterol, blood lipids, blood pressure, and risk of cardiovascular disease [60, 61, 62].

Weight Management

Dietary fiber is inversely associated with weight; that is, people who eat more fiber tend to have lower body mass. Fiber is believed to support weight loss by increasing feelings of fullness [45, 63, 64, 65].

Again, this doesn’t mean that the carnivore diet will make you gain weight; in fact, some people have reported losing weight after switching to an all-meat diet. Furthermore, strict adherence to a high-protein diet also supported weight loss in some studies [66, 67].

Even in an omnivorous diet, eating lean beef prevented cravings and feelings of deprivation, two factors associated with successful weight loss, in a study of 120 people trying to lose weight [68].

6) “Plant Antinutrients Cause Disease”

This is a complicated claim. Some plant compounds may produce inflammatory symptoms and diseases, but not in all people.

Food intolerance, also called food sensitivity, may result when a person’s body does not have adequate natural defenses against mildly toxic (or even otherwise beneficial) compounds.

You may have heard of “leaky gut,” also called intestinal permeability. This is a theoretical condition whereby the natural barriers between the intestine and the rest of the body break down, allowing potentially harmful compounds to pass through [69].

Some compounds found in plant-based foods are believed to break down these natural barriers and cause inflammation, autoimmunity, and disease. Compounds that have been reported to be associated with health conditions include [70]:

Lectins [71]

Histamine [72]

Gluten [70]

Saponins [73]

Salicylates [74, 75]

Sulfites [76, 77]

Glutamate [78]

Most people have well-developed natural defenses or an immune system that is suited for plants and will not experience the symptoms of food sensitivity [79, 80].

Those who are sensitive may develop celiac disease, rheumatoid arthritis, and other autoimmune conditions [81, 82, 83].

If you believe that you are sensitive to something in certain plants, we strongly recommend working with a doctor or nutritionist to identify and safely avoid the culprit.

Hormetic Stress

Proponents of the carnivore diet often say that plant foods contain toxic chemicals that put undue stress on our bodies. Indeed, some bioactive plant compounds do cause stress – but sometimes, that’s a good thing [84, 85].

It’s called hormetic stress: a period of stress, usually brief and mild, that activates the body’s natural defenses for an overall positive effect on health. This is how many medicinal and health-food plants increase the expression of genes associated with a faster metabolism, longevity, and the antioxidant defense [84, 85].

As a general rule, whole fruits and vegetables are healthy for the overwhelming majority of the population. In fact, many studies show that consuming fruits and vegetables is associated with reduced risk of dying from all causes. The optimal level of consumption was found to be 7-10 servings a day of fruit and vegetables [86, 87].

7) “Plants are Fine as Medicine, but Not As Food”

Some proponents of the carnivore diet will claim that plant foods cause disease while also accepting that plants can and should be eaten medicinally. The line between medicine and food is difficult – perhaps impossible – to draw. The link between diet and health has been established at least since the age of Hippocrates, more than 2400 years ago [88].

Nevertheless, it is important to understand which plant compounds (if any) may cause sensitivities and which are beneficial for each individual.

For example, some people may be sensitive to plant saponins and develop inflammation after ingesting them. On the other hand, these same compounds can stimulate the immune system, reduce cholesterol, prevent cavities, and decrease some people’s risk of developing cancer [73, 89].

One person’s medicine is another’s poison. It’s up to you (and your doctor or nutritionist!) to determine which compounds are harmful and which may be medicinal to you.

Carnivorous Cultures

Generally speaking, the colder the environment, the less plant life grows there. This means that people who traditionally live in very cold climates tend to eat more meat and less plant matter. The Chukotka people of Siberia and the Inuit and Eeyouch (northern Cree) of Canada are great examples of such cultures [90, 91, 92].

However, none of these cultures completely cut plants out of their diets; in fact, they would go to great lengths to gather and eat fruits, roots, and medicinal herbs during the warmer seasons [90, 91, 92].

Consider the Eeyouch of Eeyou Istchee, a bitterly cold territory in northern Quebec. The vast majority of a traditional Eeyouch diet is made up of wild game like moose and caribou, fowl like goose, and fish. However, during the summer months, they gather huge quantities of wild blueberries and Labrador tea. All year round, they make tea with white spruce needles and gather a variety of medicinal plants and mosses [92, 93, 94].

Even the Maasai people of Kenya and Tanzania – whose famously carnivorous diet is made almost entirely of cow’s blood, meat, milk, and honey – eat herbs, roots, and tree bark as part of traditional medicine [95, 96].

8) “The Carnivore Diet Cures Disease”

On the other side of the coin is the claim that the carnivore diet can cure disease. This claim is no less complicated than the last.

Ketosis and Disease

Ketosis is a state wherein the body cannot get energy from circulating glucose, so it turns to fats as a primary energy source instead. This is achieved through a ketogenic diet, in which up to 80-90% of calories come from fat rather than protein and carbohydrates [97].

Recent modifications to the ketogenic diet lower the share of calories from fat to about 55-60%, raise the share from protein to 30-35%, and leave the remaining 5-10% for carbs [98].
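To put these percentages in concrete terms, here is a quick sketch that converts calorie shares into daily gram targets. The 2,000 kcal/day figure is an illustrative assumption, not part of the studies cited; the conversion uses the standard Atwater factors of 9 kcal/g for fat and 4 kcal/g for protein and carbohydrate.

```python
# Convert ketogenic-diet calorie percentages into daily gram targets.
# Assumes an illustrative 2,000 kcal/day intake and standard Atwater
# factors: 9 kcal/g for fat, 4 kcal/g for protein and carbohydrate.

KCAL_PER_GRAM = {"fat": 9, "protein": 4, "carbs": 4}

def grams_per_day(total_kcal, percent_of_calories, macro):
    """Grams of a macronutrient supplying a given share of daily calories."""
    return total_kcal * percent_of_calories / 100 / KCAL_PER_GRAM[macro]

total = 2000  # kcal/day, an assumed example figure

# Classic ketogenic diet: ~85% of calories from fat
print(round(grams_per_day(total, 85, "fat")))       # ~189 g fat

# Modified ketogenic diet: midpoints of 55-60% fat, 30-35% protein, 5-10% carbs
print(round(grams_per_day(total, 57.5, "fat")))     # ~128 g fat
print(round(grams_per_day(total, 32.5, "protein"))) # ~162 g protein
print(round(grams_per_day(total, 7.5, "carbs")))    # ~38 g carbs
```

Even the modified version works out to well over 100 g of fat per day at this calorie level – a very different plate from the typical Western diet.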

Ketogenic diets have recently become the focus of case studies on prevention and management strategies for cardiovascular and neurodegenerative diseases and for some cancers. Some people with epilepsy reduced or eliminated their seizures while they were in ketosis. Surprisingly, a high-fat ketogenic diet also improved and prevented non-alcoholic fatty liver disease in 5 patients [97, 99, 100, 101, 102, 103].

However, these case studies and tiny pilot studies should not be used as grounds for any medical claims. Clinical studies with large groups of subjects and controls are required to determine whether any actual link exists between ketogenic diets and disease outcomes.

The Potential Dangers of Ketosis

Ketogenic diets have their detractors in the scientific community, and ketosis has been linked with health problems of its own:

In the early stages, ketosis may cause dehydration and electrolyte loss. It is important to drink a lot of water and replenish your electrolytes to avoid other complications [98, 104, 105].

Some doctors argue that ketosis may increase the risk of kidney stones, high cholesterol, impaired immune function, eye disease, osteoporosis, and protein deficiency. These side effects have mostly been reported in epileptic children treated with ketogenic diets [106].

The same diet can have completely different effects from one person to the next, depending on factors such as genetics, ethnicity, and health status. Blood cholesterol is a reliable indicator of how your diet is affecting your risk of several diseases; it’s important to talk to your doctor before starting any new diet so they can keep an eye on health markers like cholesterol [107, 108, 109].

According to some studies, high-protein ketogenic diets may increase bone resorption and, thus, the risk of osteoporosis. The connection is unclear, however; other sources suggest that high protein intake decreases the risk of fracture. If you are on a ketogenic diet of any kind, make sure you get enough calcium to keep your bone mineral density high [110].

9) “The Carnivore Diet Cannot Give You Scurvy”

The issue of vitamin C may be the most contentious thing about the carnivore diet. Does meat have vitamin C? Do carnivores even need vitamin C? What are its functions in the body, and do these change on a diet high in carnitine or collagen? Unfortunately, there are no studies directly exploring these questions; we can only infer based on the best information we have.

Yes, Carnivores Can Get Scurvy

First, let’s establish that it is entirely possible to get scurvy if you eat nothing but muscle meat. In fact, some researchers have suggested that scurvy is under-diagnosed in modern medicine because doctors assume that it is a disease of the past. In one case, a man who ate mostly canned beef developed scurvy twice; both times, his disease reversed its course after vitamin C supplementation [111, 112].

In another case, a child in Dubai developed scurvy after his parents fed him nothing but meat for two years. Children on prescribed ketogenic diets for epilepsy are also at risk of scurvy and may need to supplement to prevent disease [113, 114].

Exclusively eating meat has been identified as a risk factor for scurvy. Does that mean it’s impossible to get enough vitamin C from a carnivore diet? As we wrote in our article on the nutritional risks of this diet: no. Organ meats like thymus and spleen can help keep vitamin C reserves up [111, 115, 116].

But to get your vitamin C from organ meats, you’d need to eat them raw or undercooked. As mentioned above, this carries significant risks of its own.

Even if your intake is too low and your stored vitamin C is depleting, it may take months or years to develop scurvy. Just because you don’t have scurvy right now doesn’t mean you’re not at risk of developing scurvy later [117, 118, 119].

If you insist on being a strict carnivore, eat thymus and spleen or supplement with vitamin C, and talk to your doctor before making significant changes to your diet.

Does Meat Contain Vitamin C?

As we just mentioned, some organ meats like thymus and spleen contain vitamin C. 100 g of veal thymus (often sold as sweetbreads) contains 39.4 mg of vitamin C; 100 g beef spleen contains 50.3 mg of vitamin C [115, 116].

Fresh beef has more vitamin C than either processed or canned [120, 112].

Grass-fed beef also contains significantly more vitamin C than grain-fed beef. Very high quality, pasture-raised South American beef can contain as much as 2.5 mg of vitamin C per 100 g of meat (25 mg per kg), but this is the exception, not the rule. If your finances allow, buy the highest-quality grass-fed beef you can find. If not, consider supplementing [120].
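Using the per-100 g figures quoted above, a rough back-of-the-envelope calculation shows how much of each food it would take to cover a full day’s vitamin C. The 90 mg/day target is the US RDA for adult men (75 mg for women, plus 35 mg for smokers) and is our assumption here, not a figure from the cited studies; actual vitamin C content varies with freshness, cooking, and the individual animal.

```python
# Rough estimate of how much of each food would cover the adult male
# RDA for vitamin C (90 mg/day), using the per-100 g figures quoted
# in the text above. Illustrative only.

RDA_MG = 90  # adult men; 75 mg for women, +35 mg for smokers

vitamin_c_mg_per_100g = {
    "veal thymus": 39.4,
    "beef spleen": 50.3,
    "grass-fed beef (best case)": 2.5,
}

for food, mg_per_100g in vitamin_c_mg_per_100g.items():
    grams_needed = RDA_MG / mg_per_100g * 100
    print(f"{food}: ~{grams_needed:.0f} g/day")

# veal thymus: ~228 g/day
# beef spleen: ~179 g/day
# grass-fed beef (best case): ~3600 g/day
```

The comparison makes the practical point: even best-case grass-fed beef would require eating roughly 3.6 kg per day, which is why organ meats or supplements matter so much on this diet.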

Factors Increasing Vitamin C Requirements

Optimal vitamin C intake is different for different people. Plus, our requirements change depending on diet, lifestyle, and disease [121, 122].

Smoking and high sugar intake can increase vitamin C requirements.

Inhaling smoke increases oxidative stress, so the body uses up vitamin C in its antioxidant capacity. Smokers need 35 mg more vitamin C per day than non-smokers. People who breathe a lot of secondary smoke or are exposed to high levels of environmental pollution should also increase their vitamin C intake [122, 123].

Meanwhile, sugar and vitamin C compete for the same transporter for uptake into cells. As a result, people with higher blood sugar need more vitamin C. Some of the symptoms of diabetes (with chronically high blood sugar) are remarkably similar to scurvy, perhaps for this reason [124, 125].

Proponents of the carnivore diet may point to this transport competition when they claim that carnivores need less vitamin C. This has not been investigated, and thus there is no evidence backing up such a claim.

Do Carnivores Need Less Vitamin C?

Proponents of the carnivore diet often claim that increased glutathione intake and decreased carbohydrate intake decrease vitamin C requirements. They say that because meat contains lots of glutathione and zero carbs, carnivores need less vitamin C [126].

To defend this claim, they often point to the same two studies:

In a study of guinea pigs (which, like humans, cannot make their own vitamin C), glutathione ester supplementation completely prevented scurvy from vitamin C deficient diets. Some of the antioxidant functions of glutathione and vitamin C overlap, but we don’t currently know if one compound can replace the other [127, 128].

In a rat study, low- or zero-carbohydrate diets caused the animals to produce significantly less vitamin C. Carnivores sometimes claim that this means the rats needed less vitamin C when they ate fewer carbs [129].

These studies are not appropriate for evaluating whether glutathione or carbohydrate intake affects vitamin C requirements in humans. They are animal studies, which do not necessarily translate to humans, and there are several other problems with using them to defend the carnivore diet.

The guinea pig study used glutathione ester supplements rather than glutathione itself because oral glutathione is not well absorbed. The enzyme γ-glutamyl transpeptidase (GGT) breaks down glutathione in the human intestine; the body must later make more of its own. The glutathione content of meat doesn’t really matter if it will be broken down and remade anyway [127, 130].

The building blocks of glutathione are probably more important than glutathione itself. The human body makes glutathione from cysteine and other sulfur amino acids [131].

Aside from meats, dietary sources of cysteine include soybeans and walnuts. Plus, cruciferous vegetables boost glutathione through their sulfur compounds. So carnivores won’t necessarily get more glutathione or benefit from it more than people who eat plant-based foods [132, 133, 134, 131].

Rats, unlike guinea pigs and humans, can make their own vitamin C. In one study, when rats were fed low- or zero-carbohydrate diets, they produced significantly less vitamin C. According to some carnivores, this means that animals need less vitamin C when they eat fewer carbs [129].

The central problem with this interpretation is, of course, that humans cannot make vitamin C; rats are therefore irrelevant to understanding human vitamin C metabolism. Plus, the rats in this study didn’t just eat fewer carbs, they were starved. They didn’t make vitamin C because they didn’t have the building blocks, mainly glucose, required to do so [129].

It’s better to be safe than sorry, and it’s dangerous to assume that you don’t need vitamin C based on a slice of a rodent’s liver and some anecdotes you read online.

Collagen

Some carnivores have claimed that eating collagen or carnitine reduces or eliminates the need for vitamin C.

Proponents claim that because the human body requires vitamin C to make its own collagen, eating collagen reduces vitamin C requirements. However, when we ingest collagen, we break it down into prolylhydroxyproline (Pro-Hyp) and hydroxyprolylglycine (Hyp-Gly). Fibroblasts (connective tissue cells) then use Pro-Hyp and Hyp-Gly to synthesize new collagen [135, 136].

Collagen synthesis still requires vitamin C. Furthermore, the combination of vitamin C and collagen peptides is better for skin health than either alone [137, 135, 136].

Carnitine

Carnitine, meanwhile, is a compound that transports fats into the mitochondria so that they can be burned for energy. Vitamin C supports one of the enzymes that help make carnitine in the body, but may not always be required for its function [138, 139, 140].

Meat, especially beef, is rich in carnitine, which is readily absorbed in the small intestine. The more carnitine we get from our diets, the less we presumably need to make for ourselves. This does suggest a reduced need for vitamin C; however, we do not know what proportion of vitamin C is used for this purpose [138, 140].

Increased fat intake may increase our carnitine requirements over time as well. When carnitine transports fatty acids for energy production, it is consumed; in order to get more energy from fats, then, we need more carnitine. Dietary carnitine is absorbed at a rate of 54-87% in the intestine. If our bodies need more, they make it in a process supported by vitamin C [141, 142, 143, 144].

In Brief

Vitamin C is a nutritional requirement whether or not you eat plants, collagen, carnitine, or anything else. High blood sugar, environmental pollution, and tobacco smoke increase your requirements.

Consuming more carnitine or collagen has an unknown effect on vitamin C needs, but the impact is likely insignificant. Eating grass-fed beef, thymus, and spleen will ensure that you get vitamin C in your carnivore diet. Work with a doctor or nutritionist to avoid this dangerous deficiency.

The Greenhouse Gas Problem

The carnivore diet’s long-term effect on health is poorly studied and not well understood. Beyond these health concerns lies another issue: livestock agriculture’s effect on greenhouse gas emissions and climate change.

Broadly speaking, raising animals like cattle produces more greenhouse gases than growing plants. If more people switch to a carnivore diet, those people will increase their carbon footprint and contribute more per capita to climate change. Thus, the carnivore diet is not a sustainable option for a majority or even a plurality of the world population [145].

People who want to eat a carnivore diet while minimizing greenhouse gas emissions should consider a “nose to tail” approach: eating as much of a single animal as possible, raised as nearby as possible, with as little transportation and processing as possible. They can also choose to eat lean, energy-efficient species that are native to their area, such as kangaroo in Australia [146, 147].

Another creative option to avoid plants and reduce one’s carbon footprint is to consider eating insect products like cricket flour [147].