Monday, October 31, 2011

Is phytate really a problem?

As mentioned in a previous post, there is increasing evidence of human adaptation to gluten consumption. This adaptation is not genetic, but symbiotic. It appears that we have developed new symbiotic relationships with specific microorganisms that help us degrade gluten and, by doing so, allow us to exploit an unnatural food source.

Aside from its resistance to degradation by mammalian enzymes and the creation of neo-epitopes from partially digested gliadin peptides, one common reason paleo advocates give for avoiding wheat is its phytate content (1, 2). Phytate is an anti-nutrient which binds to and forms complexes with proteins, lipids, carbohydrates, and metal ions (zinc, iron, calcium and magnesium), thereby reducing their bioavailability. Phytate is the common name for myo-inositol-(1,2,3,4,5,6)-hexakisphosphate (InsP6).

[Figure: chemical structure of InsP6 (image: scienceblogs.com)]
From its chemical structure, we can see that it is basically a myo-inositol with six phosphate groups. The ability to degrade InsP6 is conferred by phytases. There are three types of phytases, namely 3-phytase, 5-phytase and 6-phytase. The difference between these phosphatases is the position on the inositol ring at which the initial attack on a phosphoester bond takes place; thus, attack by different phytases produces different isomers. Phytase production and activity in humans is relatively low (mainly in the small intestine) (3), so the greatest source of phytases is the gut microbial community.

Gut flora phytase activity

Of the Bifidobacteria species which predominate in the human gut, the B. catenulatum group (B. catenulatum and B. pseudocatenulatum) is the most common. Haros et al. (4) examined the InsP6-degrading capacity of B. pseudocatenulatum ATCC27919, isolated from the human gut. They found that B. pseudocatenulatum is able to degrade InsP6 via sequential dephosphorylations (starting at the 6-position of the myo-inositol ring, followed by the 5-position). The solubility of mineral chelates of myo-inositol phosphates is related to the number of phosphates per molecule. InsP6 and InsP5 have adverse effects on mineral absorption. In contrast, breakdown products retaining the 1,2,3-grouping interact specifically with iron, increasing its solubility and preventing its ability to catalyse hydroxyl radical formation. Overall, the mineral-binding strength of inositol phosphates becomes progressively lower as phosphate groups are removed from the molecule (with the exception of the 1,2,3-grouping mentioned above). B. pseudocatenulatum also showed selective adhesion to Caco-2 epithelial cells and tolerance to increased concentrations of bile, which reflects its adaptation to the human gut. A previous study (5) found that B. infantis is able to degrade 100% of InsP6, producing InsP3 as the main product. The optimal pH for the phytase activity of B. infantis was 6.0-6.5, with 51.2% activity at 37°C, similar to that observed for B. pseudocatenulatum. Other Bifidobacteria species present in the human gut also have phytase activity, although to a lesser extent.

InsP6 antinutrient effect

Typically, InsP6 and fiber occur together in whole foods. This is problematic for analyzing the antinutrient effect of InsP6, as there is evidence that fiber also reduces mineral bioavailability (6). When given alone in animal models, InsP6 does not show toxic effects on bone minerals (7).


This suggests that it is the combination of fiber and InsP6 which causes the observed antinutrient effect.

The type of fiber seems to be important for mineral bioavailability. The addition of FOS to a diet high in InsP6 improves cecal absorption of minerals and stimulates bacterial hydrolysis of InsP6 (8, 9), counteracting the negative effects of high doses of InsP6. Inulin has also been shown to improve calcium balance and absorption (10). The importance of the fiber type for the effects of phytic acid is highlighted by a study in which healthy women following the recommended daily intake of fiber-rich wheat bread (300 g/day) showed impaired iron status independent of the phytic acid content (11).

Anti-cancer properties

InsP6 is a broad-spectrum antineoplastic agent in vitro and in vivo (12). Structurally, InsP6 is similar to D-3-deoxy-3-fluoro-PtdIns, a potent PI3K inhibitor. Accordingly, InsP6 is able to inhibit PI3K and ERK phosphorylation (13), thereby inhibiting AP-1 activation. InsP6 has also been shown to activate PKC delta and decrease phosphorylation of Erk1/Erk2 and Akt, causing upregulation of p27-Kip1 and reduction of pRb phosphorylation (14). Other protective effects include induction of apoptosis by inhibiting the Akt-NFkB pathway and increasing cytochrome c release (15), downregulation of constitutive and ligand-induced mitogenic and cell survival signaling (with different effects on ERK1/2, JNK1/2 and p38 in response to different mitogens) (16), an antioxidant effect (17), enhancement of NK cell activity (18), modulation of the expression of TNF-alpha and its receptor genes (19), and inhibition of angiogenesis (20) and of metastasis through modulation of integrin dimerization, cell surface expression and integrin-associated signaling (lack of clustering of paxillin and reduced FAK autophosphorylation) (21, 22). InsP6 has been shown to offer some benefits during chemotherapy (23), and further trials are under way.

Are whole grains inherently unhealthy?

Because whole grains and legumes are high in phytic acid, it is plausible to hypothesize that intake of these foods will reduce, to some extent, the risk of developing cancer. Whole-grain intake has been associated with a reduced risk of cancers (24, 25), as has legume intake (26). However, some studies have found no association (27, 28). Because of the nature of these studies, it is not possible to draw causal conclusions. Most people eating the supposedly healthy foods also have low intakes of harmful foods, so the decreased risk in some studies might be due to the exclusion, not the inclusion, of certain foods. In either case, most studies have not observed an increased cancer risk associated with these foods*. Other food sources rich in phytic acid include nuts and cocoa.

Conclusions

The dangers of phytic acid have been overestimated. Contrary to popular paleo belief, phytic acid might be beneficial in small doses and might have anticancer effects. As seen with gluten degradation by Rothia species, the phytase activity present in some exclusively human Bifidobacteria shows that adaptation to wheat/grains is indeed happening. Once again, the microbiota plays a dominant role.

From epidemiological data, foods with high phytate content are not associated with an increased risk of several chronic diseases. As association does not mean causation, we cannot conclude that whole grains are healthy, but neither can we conclude that they are unhealthy. With the increasing attention to paleolithic and similar diets, it is of utmost importance that all evidence is critically analyzed and reviewed. Making unsupported statements and cherry-picking data will only cause rejection by scientists. Dogma is not good in science (or in anything else, for that matter).

I don't recommend whole grains and legumes, because there are more nutritious foods and because whole grains and legumes are very high in carbohydrates. The potential benefits of phytate can be obtained by eating other phytate-rich foods, such as nuts and cocoa, with soluble fiber and oligosaccharides as the main dietary fiber types. The problem with high levels of phytate is only relevant when the diet is deficient in micronutrients and essential food sources. Finally, maintaining a proper gut flora is essential for phytic acid metabolism and adequate mineral absorption.

*Any evidence of a significant increased risk from these foods would be greatly appreciated.

Haros M, Carlsson NG, Almgren A, Larsson-Alminger M, Sandberg AS, & Andlid T (2009). Phytate degradation by human gut isolated Bifidobacterium pseudocatenulatum ATCC27919 and its probiotic potential. International journal of food microbiology, 135 (1), 7-14 PMID: 19674804

Haros M, Bielecka M, Honke J, & Sanz Y (2007). Myo-inositol hexakisphosphate degradation by Bifidobacterium infantis ATCC 15697. International journal of food microbiology, 117 (1), 76-84 PMID: 17462768

Thursday, October 27, 2011

Bifidobacteria, butyrate and carbohydrates

In a previous post, John asked:
Regarding your old post on ketogenic diet and microbiota, why do you think bifidobacterium decreased on low carb? I would generally guess this is a negative...?
I cited two studies on low-carbohydrate dieting and gut microbiota composition, one by Duncan et al. (1) and the other by Brinkworth et al. (2). They showed a negative effect of reducing carbs on the gut flora, measured by species composition (16S rRNA) and SCFA. Both analyzed fecal samples. In general, fecal samples are reliable and make it easier to study colonic SCFA metabolism. However, they are an indirect method of quantification. Of the three main SCFA produced in the colon, only acetate has shown a correlation between fecal concentration and absorption (3):

[Figure from Vogt & Wolever (3): correlation between acetate absorption and fecal acetate concentration. Copyright © 2011 by the American Society for Nutrition]


The data show a negative correlation (r = -0.834) between acetate absorption from an infusion and fecal acetate concentration. This means that the fecal concentration of acetate might reflect absorption rather than production, in an inverse manner (less acetate in fecal samples equals more absorption). In this study, neither propionate nor butyrate showed a correlation between absorption and fecal concentration.
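To make the inverse relationship concrete, here is a minimal mass-balance sketch (all numbers are hypothetical illustrations, not data from the study): if colonic production stays roughly constant, whatever the mucosa absorbs never reaches the stool, so a higher absorbed fraction necessarily means a lower fecal concentration.

```python
# Toy mass balance: fecal acetate ~ production - absorption.
# All numbers here are hypothetical illustrations, not data from Vogt & Wolever.

PRODUCTION = 300.0  # mmol/day, assumed-constant colonic acetate production

for absorbed_fraction in (0.80, 0.90, 0.95):
    absorbed = PRODUCTION * absorbed_fraction
    fecal = PRODUCTION - absorbed
    print(f"absorbed {absorbed_fraction:.0%} -> fecal acetate ~ {fecal:5.1f} mmol/day")

# Fecal acetate falls as the absorbed fraction rises even though production never
# changes -- the inverse pattern (r = -0.834) described above.
```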

SCFA in the Duncan et al. study

Acetate, butyrate and propionate concentrations in fecal samples from the Duncan et al. study are shown below:

SCFA concentrations (mM) in fecal samples. M = maintenance; HPMC = high-protein, moderate-carbohydrate; HPLC = high-protein, low-carbohydrate. Mean values.

As the intake of carbohydrate decreased, there was a parallel reduction in all three SCFA. 

SCFA in the Brinkworth et al. study

Acetate, butyrate and propionate concentrations in fecal samples from the Brinkworth et al. study are shown below:

Fecal SCFA concentrations (mM) after 8 weeks of either a low carbohydrate (LC) or high carbohydrate (HC) diet. Mean values. 

As seen in the Duncan et al. study, after 8 weeks with a low-carbohydrate diet, SCFA concentrations were reduced, although not as drastically. 

Analysis and interpretation of the data

Both studies show a clear correlation between carbohydrate intake and SCFA concentration in fecal samples. The magnitude of the changes between individual SCFA might be due to differences in the intervention time (4 weeks vs. 8 weeks). 

As shown by Vogt and Wolever (see above), acetate concentration in fecal samples reflects acetate absorption more than production. Thus, lower fecal acetate levels with reduced carbohydrate intake reflect greater acetate absorption (or utilization, see below).

Most of the focus has been on the apparent reduction in butyrate levels, which may compromise colonic health. In this regard, low-carbohydrate diets might be detrimental for colonic health because of reduced butyrate production. To assess the validity of this statement, we must look at colonic butyrate metabolism.

Colonic butyrate metabolism 

Approximately 95% of the butyrate produced in the colon is absorbed. This is why fecal concentrations are not a good guide to production rates: a very high proportion of the SCFA is taken up by the colonic mucosa (4). Butyrate is produced from two molecules of acetyl-CoA, yielding acetoacetyl-CoA, which is further converted, ultimately, to butyryl-CoA. This metabolite can be converted to butyrate via butyrate kinase or butyryl-CoA:acetate CoA transferase.

Butyrogenic substrates include starch, inulin and xylan, but certain species are also capable of producing butyrate from acetate. Synthesis of butyrate from acetate proceeds via the butyryl-CoA:acetate CoA transferase pathway, which seems to be the most prevalent route of butyrate synthesis by human gut bacteria (5). So, while glucose is needed for butyrate synthesis, acetate seems to be the main substrate for butyrate formation. The predominance of the acetate-dependent pathway might reflect a selective advantage for bacteria that transform acetate into butyrate in the colon, where acetate concentrations are high.

Overall, the reduction in fecal acetate and butyrate concentrations may translate into increased absorption and reduced excretion. Butyrate can be synthesized from acetate, which reduces the concentration of both SCFA in feces. The Km for butyrate transport in the colon has been found to be 14.8 +/- 3.6 mM (6) and 17.5 +/- 4.5 mM in the proximal colon (7). The apparent saturation kinetics shown by butyrate transport across the colonic luminal membrane could further explain the results of the studies mentioned above: increasing the carbohydrate content of the diet would augment the number of glucose-dependent butyrogenic bacteria, increasing the colonic production and concentration of butyrate. Because transport of butyrate is saturable, excess butyrate is excreted, producing increased levels in feces.
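To see how saturable transport can push surplus butyrate into the stool, here is a minimal Michaelis-Menten sketch. The Km is taken from the value quoted above (~14.8 mM); the Vmax and the luminal concentrations are made-up illustrative numbers, not data from these studies.

```python
# Minimal Michaelis-Menten sketch of saturable colonic butyrate uptake.
# Km comes from the value quoted in the text (~14.8 mM); Vmax and the luminal
# concentrations are hypothetical, chosen only to illustrate saturation.

KM_MM = 14.8   # mM, apparent Km for colonic butyrate transport (ref. 6)
VMAX = 100.0   # arbitrary units, hypothetical maximal uptake rate

def uptake_rate(luminal_mm):
    """Michaelis-Menten uptake rate at a given luminal butyrate concentration."""
    return VMAX * luminal_mm / (KM_MM + luminal_mm)

for conc in (5, 15, 30, 60, 120):  # mM, hypothetical luminal butyrate levels
    rate = uptake_rate(conc)
    print(f"{conc:4d} mM luminal -> uptake {rate:5.1f} ({rate / VMAX:.0%} of Vmax)")

# Uptake is already near 50% of Vmax around the Km and approaches Vmax only slowly
# above it, so once production outpaces the transporter the surplus ends up in feces.
```

Doubling the luminal concentration from 60 to 120 mM in this sketch raises uptake by only about 10% of Vmax, which is the point: beyond saturation, extra production shows up mostly as extra excretion.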

The case for Bifidobacteria

The change in Bifidobacteria concentrations after the low-carbohydrate diet is due to the presence of an important number of bacteria capable of degrading glucose/starch. In this scenario, reduced carbohydrate availability would reduce the number of total Bifidobacteria (or at least of certain species). This does not mean that this is bad per se. It is important to determine which specific Bifidobacteria are responsive to diet. For instance, B. longum seems to be capable of catabolizing not only dietary oligosaccharides but also glycoproteins and glycoconjugates from the host, as well as nucleotides (8). Moreover, gut Bifidobacteria (as shown by the genomic analysis of B. longum) are capable of adapting to different carbohydrate substrates depending on their availability (9). In addition, metabolic cross-feeding occurs between Bifidobacteria and other species. For example, E. hallii, a butyrogenic bacterium, is unable to grow on pure starch by itself. Co-culture of this bacterium with B. adolescentis stimulates its growth and butyrate synthesis, paralleled by a reduction in lactate levels (10). The scheme is pretty simple: B. adolescentis is capable of fermenting starch, producing lactate which serves as a substrate for E. hallii. Other, lactate-independent mechanisms of cross-feeding have also been observed in FOS and oligofructose-only co-cultures of B. longum with Roseburia intestinalis or Anaerostipes caccae, which are capable of producing butyrate and consuming acetate (11).

Because of the complexity of the system, the specific mechanisms by which certain Bifidobacteria could be beneficial are unknown, although there is evidence of health benefits from increasing gut Bifidobacteria (12, 13, 14). There are some issues with interpreting the evidence on this topic:

  • Many authors don't determine the exact species being studied (they treat all Bifidobacteria as a group).
  • Supplementation is done with different strains and the long-term effects are not known, because bacteria supplemented via diet are treated as allochthonous. 
  • Genomic inspection has shown that Bifidobacteria are metabolically very flexible. Adaptation to substrate variations might take longer than 8 weeks. 
  • There is metabolic cross-feeding occurring between bacteria. This is a highly complex network of connections, of which we are only starting to get an initial picture. 

With this in mind, I cannot say for certain whether or not a low-carbohydrate diet is harmful to the gut microbiota. As far as the evidence goes, we can only speculate and formulate hypotheses. And useful hypotheses should be based on logic and evolutionary inference. We should ask not only "how" but also "why". In this case, we are not going to focus on the "how" but on the "why"; to put it formally, why is an increased intake of starch associated with an increase in Bifidobacteria? What is the evolutionary basis?

One important protective role of Bifidobacteria is preventing colonization by enteropathogens through reducing their adhesion to intestinal epithelial cells. This has been shown directly for E. coli and S. typhimurium (15). Other commonly problematic Enterobacteriaceae include Klebsiella and Shigella. Growth of these pathogens is stimulated by high-glucose, low-oxygen conditions. The selective advantage of having diet-responsive Bifidobacteria in the gut might be protection: as increased glucose concentrations favor an environment suited to the growth of these pathogens, there has to be a mechanism by which the composition of the normal microbiota is maintained. A parallel increase in Bifidobacteria with increasing concentrations of dietary carbohydrate would restrain colonization by pathogenic anaerobes. The fact that certain species of Bifidobacteria can metabolize different oligosaccharides and adapt to substrate availability supports this hypothesis. I might elaborate more on this in subsequent posts. 

Conclusions

Low-carbohydrate diets seem to reduce the fecal concentration of SCFA in the short term. Some adaptation seems to occur, judging by the differences between the study periods (4 weeks vs. 8 weeks). Fecal concentrations of SCFA are not good indicators of colonic SCFA production; rather, they reflect excretion (butyrate) and absorption (acetate). Butyrate can be produced from different substrates, of which acetate is the main precursor in the human gut. There is a reduction in the levels of Bifidobacteria detected in stool samples, proportional to the decrease in dietary carbohydrate. Although no individual species were identified, studies have shown that Bifidobacteria are capable of adapting to substrate availability and of cross-feeding with other bacteria. The evolutionary basis for increased Bifidobacteria in response to sugar might involve a protective mechanism against colonization by enteropathogenic bacteria, such as E. coli, Klebsiella, Shigella and Salmonella.

Duncan SH, Belenguer A, Holtrop G, Johnstone AM, Flint HJ, & Lobley GE (2007). Reduced dietary intake of carbohydrates by obese subjects results in decreased concentrations of butyrate and butyrate-producing bacteria in feces. Applied and environmental microbiology, 73 (4), 1073-8 PMID: 17189447

Brinkworth GD, Noakes M, Clifton PM, & Bird AR (2009). Comparative effects of very low-carbohydrate, high-fat and high-carbohydrate, low-fat weight-loss diets on bowel habit and faecal short-chain fatty acids and bacterial populations. The British journal of nutrition, 101 (10), 1493-502 PMID: 19224658

Friday, October 21, 2011

Rothia to the rescue

Gluten is problematic. Almost every paleo advocate agrees that wheat should be restricted in the diet because the gliadin peptides generated by incomplete digestion of gluten produce highly reactive epitopes, which can then trigger a T-cell response in the gut. 

The main issue with the "gluten is bad for everyone" meme is that not all people develop gluten sensitivity or celiac disease. Many can tolerate large amounts of wheat-based foods for years without serious health consequences. This is often attributed to the lack of an environmental trigger that increases susceptibility to gliadin peptides (i.e. inflammation). 

A recent paper (1) found that two strains from the genus Rothia, namely R. mucilaginosa and R. aeria, are capable of metabolizing gluten. These are oral microorganisms.

After isolation on a gluten agar medium, these microbes were capable of hydrolyzing YPQ tripeptides (which occur with high frequency in gluten sequences) and KPQ. Moreover, R. aeria degraded gliadin in vitro.

SDS-PAGE of aliquots from the incubation mixture
The arrow shows the major protein constituent in the gliadin mixture. As can be seen, gliadin was progressively degraded (lanes 2-7, figure A). Shorter time intervals (lanes 2-7, figure B) show that almost 50% of gliadin was degraded in 30 minutes. Boiling the bacterial suspensions abolished degradation (lanes 8 and 9, figure B), suggesting enzyme denaturation. Lanes 10-11 and 12-13 served as negative and positive controls, respectively. 

Proteolytic degradation of two problematic peptides (the α-gliadin-derived 33-mer and the γ-gliadin-derived 26-mer) was compared between mammalian enzymes (pepsin, trypsin, chymotrypsin) and R. aeria:

RP-HPLC of sample aliquots
As can be seen, the chromatograms from all mammalian enzymes show the same pattern at 0 and 24 h, indicating no digestion of the peptides (A-C). In contrast, the sample containing R. aeria (WSA-8, figure D) showed different peaks earlier in the chromatogram, representing degradation fragments that elute earlier. At 2 hours, the peptide was completely degraded. 

These results show that enzymes present in R. aeria are capable of degrading gliadin and two peptides (the 33-mer and 26-mer) which are resistant to mammalian enzymes. Analysis by mass spectrometry determined that cleavage occurred after QPQ and LPY for R. aeria, and after XPQ and LPY for R. mucilaginosa (X denotes any amino acid). This is important because these tripeptides are part of the immunogenic epitopes contained within the 33-mer (glia-α9, glia-α2) and 26-mer (glia-γ2) peptides:

glia-α9: LQLQPFPQPQLPY
glia-α2: PQPQLPYPQPQLPY
glia-γ2: PFPQQPQQP / PYPQQPQQP

It is worth noting that cleavage by both Rothia strains was also observed after Q residues along the 26-mer sequence. Repeated Q residues (along with P residues) are responsible for the resistance to proteolysis by mammalian enzymes. 

Zymography at pH 7.0 showed that the putative gliadin-degrading enzymes had molecular weights of approximately 75 kDa and 70 kDa for R. mucilaginosa and R. aeria, respectively. Further analysis of enzyme activity at different pH values revealed that the enzymes from R. mucilaginosa were completely inactive at pH 3.0, while R. aeria maintained weak enzymatic activity. The optimal pH was 7.0, and substrate hydrolysis rates declined in parallel with decreasing pH from 7.0 to 4.0. At pH 3.0, R. aeria showed much slower activity, and at pH 2.0, activity was completely abolished. 

Summary

- R. mucilaginosa and R. aeria were capable of degrading gliadin and immunogenic peptides in vitro.
- The enzymes involved have an approximate molecular weight of 70-75 kDa. 
- Degrading activity of R. aeria was maintained at pH 3.0. 
- Both species are normal colonizers of the oral cavity, and R. mucilaginosa also colonizes other areas.

From the discussion (emphasis mine):

"R. aeria (...) is an oral colonizer [31]. R. mucilaginosa also primarily colonizes the oral cavity [33] but has furthermore been isolated from other body sites, including the upper respiratory tract and the duodenum [34,35,36]"

"Bacterial speciation of 2,247 clones recovered from 63 duodenal biopsies obtained from healthy and celiac patients showed that R. mucilaginosa comprised [aprox.] 6% of the clones and was present in [aprox.] 65% of the biopsies, identifying it as a true colonizer of the duodenum [36]."

The million dollar question:

"The discovery of salivary microorganisms degrading dietary proteins in vitro prompts the question to what extent such microorganisms play a role in food processing in vivo. During mastication (chewing) foods are mixed with whole saliva helping to accelerate the break-down by digestive enzymes during the residency time in the oral cavity. Oral microorganisms in the swallowed food bolus may or may not survive and/or continue to exert proteolytic activities during or after gastric passage. Our in vitro data with R. aeria show that its enzymes are not abolished at acidic pH values, and are optimally active under more basic pH conditions. In vivo, this could mean that during gastric passage the enzymes will neither be active nor destroyed, and that enzymatic reactivation would occur upon transfer to the duodenum.

With regard to duodenal Rothia enzyme activity, it is relevant that R. mucilaginosa gains a foothold in the duodenum [36]. This offers the intriguing possibility that Rothia may colonize the duodenum and perform proteolytic activities locally in conjunction with mammalian- derived enzymes to degrade gluten."

The study presented here is a follow-up to one published in 2010 by the same group (2), in which they found that oral bacteria were capable of completely degrading immunogenic gliadin peptides. 

Further studies should help elucidate detailed information about the enzymes responsible for gluten degradation by these bacteria. 

This relationship might offer one way in which we are evolving and adapting to foods introduced in the Neolithic: not only through changes in genes and gene expression (e.g. AMY1), but also by establishing new symbiotic relationships with microorganisms. 

Zamakhchari M, Wei G, Dewhirst F, Lee J, Schuppan D, Oppenheim FG, & Helmerhorst EJ (2011). Identification of Rothia bacteria as gluten-degrading natural colonizers of the upper gastro-intestinal tract. PloS one, 6 (9) PMID: 21957450

Tuesday, October 11, 2011

Food and antibiotic resistance genes

Antibiotic resistance (AR) and horizontal transmission of resistance genes are a major health problem in the modern world. Antibiotic abuse and dysbiosis seem to be the main causes, but an interesting potential source of these genes is just beginning to be explored. I came across a paper which analyzed the level of AR genes in common retail foods (1). 

The specific markers examined were ermB, ermC, tetS/M and tetA, which encode ribosomal modification and tetracycline (Tet) efflux mechanisms. Foods analyzed included different kinds of cheese (Cheddar, Swiss, Colby, Mozzarella), yogurt, raw milk, shrimp, pork chop, deli turkey, deli beef, mushroom and spinach. AR microbes were found in nearly all samples, both raw and ready-to-eat. The only foods with no detectable AR microbes were processed cheese (heat treated during manufacture) and yogurt. Twenty of the 23 cheese samples contained Tetr and/or Emr (tetracycline and erythromycin resistance genes, respectively), and the number of Tetr microbes was greater than that of Emr in this food. 

The presence of selected AR genes from cheese and milk was analyzed by conventional PCR. It was found that:

- Among Tetr isolates recovered from cheese, about 10% contained the tetS/M gene. Seven out of 11 tetS/M+ isolates were Streptococcus thermophilus and two were Lactococcus lactis. Two additional isolates had 97% 16S rRNA gene sequence identity to an unidentified Lactococcus sp. and 93-94% identity to Lactococcus garvieae and Lactococcus lactis, similar to an isolate from milk, suggesting that Lactococcus might be a common organism in milk. Another tetS/M+ isolate from raw milk was identified as Leuconostoc sp. Two isolates from cheese carried the tetA gene, as did several isolates from raw pork meat; these were identified as Pseudomonas sp. 

- Among Emr isolates from cheese, more than 50% had the ermB gene. The carrier organisms identified were Staphylococcus sp. (5 out of 28) and S. thermophilus (23 out of 28). One isolate from packaged sliced chicken lunchmeat, characterized as Pseudomonas sp., contained the ermC and tetS/M genes, suggesting a multidrug-resistance phenotype. 

MIC analysis showed resistance to tetracycline, erythromycin, clarithromycin and clindamycin in several isolates, some of them showing multidrug resistance.

Because of the importance of horizontal transfer of AR genes to the human microbiota, the authors tested the potential for transfer from some of the identified strains to resident oral bacteria. They used Streptococcus mutans, a cariogenic oral pathogen, as the recipient. The plasmid (20-25 kb) isolated from food-sample bacteria was successfully transferred to S. mutans, as confirmed by PCR amplification. MIC testing showed that the transformed S. mutans had significantly increased resistance to tetracycline compared with the parental strain.  

The presence of AR genes in other foods and the horizontal transfer of resistance genes to the human microbiota have been confirmed in other studies (2, 3, 4, 5). The common food source in most of these studies is cheese/dairy, which shows the highest number of CFUs and the greatest AR gene heterogeneity because of the presence of lactic acid bacteria. This is also relevant for fermented foods. 

The picture gets worse if we consider that AR bacteria and multiple resistance genes have been found in the microbiota of breast-fed babies without previous exposure to antibiotics, as well as in some breast-milk samples (6, 7). 

In my opinion, antibiotic resistance is the main public health problem nowadays. While it may be smart to avoid certain foods which are potential carriers, the problem does not lie in the presence of AR genes per se, but in the horizontal transfer between AR bacteria and human commensal bacteria, which can then pass resistance genes on to pathogenic bacteria already present in the human microbiota or to pathogens that colonize the human digestive tract. As persistence of AR genes in the absence of antibiotic selective pressure has been observed, it seems that some AR bacteria might have a fitness advantage over wild-type strains. Thus, in evolutionary terms, reducing exposure to antibiotics would not solve the problem (at least not in the short term), because the AR phenotype and its associated compensatory mutations are part of the evolution (compensatory evolution) of bacteria. 

With this in mind, the most prudent preventive measure is maintaining a healthy immune system and gut microbiota via nutrition. Competition among bacterial populations is an important aspect of pathogen colonization; in this way, dysbiosis contributes to the establishment of pathogenic colonies in the gut. Avoiding any exposure to environmental microbes is a double-edged sword: you avoid the bad ones, but also the good ones. 

Wang HH, Manuzon M, Lehman M, Wan K, Luo H, Wittum TE, Yousef A, & Bakaletz LO (2006). Food commensal microbes as a potentially important avenue in transmitting antibiotic resistance genes. FEMS microbiology letters, 254 (2), 226-31 PMID: 16445749

Tuesday, October 4, 2011

Ketogenic diet and STZ-induced diabetes

High-fat diets cause diabetes. At least, that is what we are told. Researchers frequently use streptozotocin (STZ) to induce diabetes in experimental animals. So, following this logic, a low-carbohydrate ketogenic diet (LCKD) plus STZ should make rats extremely diabetic, with a very reduced chance of surviving in the long term. 

So let's see what happens when STZ-treated rats are fed a normal chow diet (ND), an LCKD or a high-carbohydrate diet (HCHO) (1). The macronutrient ratios (C/F/P) were: LCKD 10/60/30 and HCHO 70/10/20.

Body weight remained constant in the LCKD group, while it was reduced significantly in the HCHO and ND groups. In these groups, after the administration of STZ, blood glucose (BG) increased from 105 mg/dL at baseline to 650 mg/dL at the end of the experimental period. In contrast, the LCKD group maintained BG levels around 100 mg/dL. Food intake was also drastically increased in the HCHO and ND groups, showing polyphagia. The LCKD rats showed a small increase in food intake, which then decreased and remained constant for the rest of the study. Water intake was likewise constant in the LCKD group compared to HCHO and ND, and urine output was increased in the latter groups. (Remember the "three P's" of diabetes: polydipsia, polyphagia and polyuria.) Glucosuria after STZ injection reached 1000 mg/dL, whereas the LCKD group showed negative glucosuria. Summing up: LCKD rats did not show any marker of diabetes compared to HCHO and ND rats. They maintained normal calorie intake, weight and BG levels, with no polydipsia, polyphagia or polyuria. 

One recent study warned about a mechanism by which high-fat diets could cause diabetes and beta-cell dysfunction. Yes, this is the famous study by Ohtsubo et al. (2). For a more comprehensive review of this study, please refer to the one written by Denise Minger. In a nutshell, the authors found that elevated concentrations of free fatty acids (FFA) caused nuclear exclusion and reduced expression of the FOXA2 and HNF1A transcription factors in beta cells. This resulted in depletion of GnT-4a glycosylation and glucose transporter expression, leading to beta-cell dysfunction. This is one mechanism by which lipotoxicity contributes to diabetes onset. However, STZ causes cell death in pancreatic beta cells through methylation, the release of free radicals, or the formation of nitric oxide. The mechanism found by Ohtsubo might be reversible; beta-cell destruction might not. This is one of the most important problems with advanced diabetes, and might be involved in the evolution of type 2 into type 1 diabetes (3). Thus, studies using models of beta-cell destruction might be more relevant for understanding the basis of autoimmune or chronic uncontrolled diabetes.

This leads us to the most interesting part of the recent study. The authors assessed the histology of the islets of Langerhans in the different rats by H&E staining.

[Figure: H&E-stained pancreatic sections from control and diabetic rats on each diet. Copyright © 2010 Elsevier GmbH]

(a) and (b) show pancreatic sections from control HCHO and ND rats. Circles indicate islets of Langerhans and arrows indicate vacuoles. (d) and (e) are from diabetic HCHO and ND rats, respectively. As can be seen, there are almost no islets left after STZ administration. In contrast, diabetic LCKD rats showed no reduction in islets compared to LCKD controls ((c) and (f)). 

To further assess the effect of the different diets on beta-cell destruction, the authors used Gomori's chrome alum haematoxylin-phloxine stain. 

[Figure: Gomori's chrome alum haematoxylin-phloxine staining of pancreatic islets. Copyright © 2010 Elsevier GmbH]

Beta cells are stained blue, alpha cells red and delta cells pink. (a), (b) and (c) are control ND, HCHO and LCKD; (d), (e) and (f) are diabetic ND, HCHO and LCKD, respectively. 

Overall, there was a clear protection against beta-cell destruction in the diabetic LCKD rats, compared to diabetic HCHO and ND rats. However, the number of beta-cells in control rats was not different between groups. 

How can a ketogenic diet prevent the onset of STZ-induced diabetes while a high-fat diet causes diabetes? Isn't a ketogenic diet a high-fat diet? First, a high-fat diet is not necessarily a ketogenic diet. The term "high-fat diet" is used without consensus in the literature, so a high-sugar, high-fat diet might be presented as a high-fat diet (this is why it is EXTREMELY important to read the methods). Second, lipotoxicity is a major cause of metabolic dysfunction. However, lipotoxicity does not imply a high-fat diet; it implies dysregulation of lipid metabolism. If anything, a ketogenic diet should restore normal lipid metabolism. Third, there is a difference between in vitro and in vivo results. I have highlighted the importance of this distinction before. Finally, diabetes is a highly complex disease. I believe that the most serious cases definitely have an immune component, with targeted destruction of beta cells. The authors speculated that the ketogenic diet prevented diabetes through the antioxidant effect of ketone bodies (because one of the cytotoxic effects of STZ on beta cells is mediated by the increase in free radicals).

In conclusion, saying that high-fat ketogenic diets cause diabetes is as silly as saying that high-carbohydrate diets cause diabetes. There is an extreme metabolic flexibility present in healthy humans, who can adapt to a wide range of macronutrient ratios. Food toxins, as stressed by other authors, are another source of problems that can confound the effects of different diets. 


People with diabetes might benefit from low-carbohydrate diets not only through the proximate effect of the diet (less dietary glucose, which treats the symptom, not the cause), but also because of calorie restriction, which alleviates lipotoxicity. This can also be achieved with a hypocaloric high-carbohydrate diet. But for people with autoimmune type 1 diabetes, LADA, or severe cases of type 2 diabetes, a ketogenic diet could prevent progression of the disease more efficiently by preventing oxidative stress-mediated cell death. 

Al-Khalifa A, Mathew TC, Al-Zaid NS, Mathew E, & Dashti H (2011). Low carbohydrate ketogenic diet prevents the induction of diabetes using streptozotocin in rats. Experimental and toxicologic pathology : official journal of the Gesellschaft fur Toxikologische Pathologie, 63 (7-8), 663-9 PMID: 21943927

Sunday, October 2, 2011

Good worms

[Image: Schistosoma mansoni (bioquicknews.com)]
Schistosoma mansoni (left) is a very interesting worm. In fact, it is one of the "Old Friends" with which we have co-evolved. This parasite is responsible for schistosomiasis, for which no vaccine has yet been developed. 

What seems most interesting about S. mansoni is its capacity to modulate the human immune system. While in the host, each adult worm pair releases 200-300 eggs per day, which mature over 5-6 days. Maturation involves formation of the von Lichtenberg envelope on the inside of the egg shell, from which proteins are secreted. The host reacts to the eggs and the secreted proteins by mounting a Th2-type immune response. One group of proteins present in the egg is the soluble egg antigens (SEA). The exact mechanism by which SEA and other secreted proteins affect the immune response is not entirely clear. Nevertheless, it seems to involve C-type lectin receptors (CLR), as most antigenic proteins identified so far are glycoproteins with some conserved glycans (1).  

One characteristic of parasitic helminths like S. mansoni is that they have developed many strategies to suppress the host response, thereby controlling inflammation. This mechanism has been acquired through co-evolution with their mammalian hosts, including humans. As parasite-driven infection induces a Th2-type immune response, it can be hypothesized that some parasites can modulate the onset and progression of Th1-type autoimmune diseases, like type 1 diabetes. 

NOD mice are a model for autoimmune type 1 diabetes. When infected with S. mansoni or exposed to SEA, these mice do not develop diabetes (2). To elucidate the mechanism, Zaccone et al. (3) examined the effect of SEA on Foxp3+ Treg levels. Treg cells are very important for controlling the immune response (hence the name Treg: regulatory T cells). One of their most important roles is establishing self-tolerance; any failure of self-tolerance could potentially lead to the development of autoimmunity. What has been observed is that injection of SEA into NOD mice increases the population of CD4+Foxp3+ T cells in the pancreas. SEA generates Foxp3+ T cells from naïve CD4 T cells in a TGF-b-dependent fashion. This effect was shown to occur both directly and indirectly (by upregulation of CLR on dendritic cells), which suggests a synergistic response. Specific CLR upregulated by SEA include galectins 1 and 3, SIGN-R1 and DEC-205. Moreover, SEA is capable of alternatively activating macrophages (4), consistent with the effect of other helminths (5). Depletion of CD25+ T cells from splenocytes of SEA-treated NOD mice restored their ability to transfer diabetes to recipient animals, supporting the role of Treg in diabetes prevention in SEA-treated NOD mice. In contrast, depletion of CD25+ T cells from splenocytes of soluble worm antigen (SWA)-treated NOD mice did not restore the ability to transfer diabetes to recipient NOD mice. This suggests that SEA and SWA protect against diabetes by different mechanisms, and that protection from SEA is mediated primarily by Tregs.

The protective effect of S. mansoni on other Th1-type autoimmune diseases has also been observed. For instance, helminth infection seems to protect against the progression of multiple sclerosis (MS) (6). Although this study did not correlate S. mansoni infection exclusively with MS severity, animal models of experimental autoimmune encephalomyelitis (EAE) (an animal model of MS) infected with S. mansoni exhibit reduced levels of proinflammatory cytokines and CNS inflammation (7, 8). Exposure to S. mansoni eggs and/or infection also protects mice from TNBS-induced colitis (an animal model of Crohn's disease) (9, 10), collagen-induced arthritis (an animal model of rheumatoid arthritis) (11), Graves' disease (12), and allergic diseases (13, 14). This challenges the Th1/Th2 dichotomy and suggests that there are other mechanisms by which S. mansoni modulates the immune system. 

The rise in inflammatory and autoimmune conditions may not have only a nutritional basis. Lack of exposure to some parasites essential for the development of the immune system, due to the adoption of extreme hygiene practices during early infancy, may increase the risk of allergic and autoimmune diseases. A healthy lifestyle should not only include avoiding industrialized foods, but also being exposed to different pathogens which help shape our immune system. 


Zaccone P, Burton O, Miller N, Jones FM, Dunne DW, & Cooke A (2009). Schistosoma mansoni egg antigens induce Treg that participate in diabetes prevention in NOD mice. European journal of immunology, 39 (4), 1098-107 PMID: 19291704