Painting by Israel Bernbaum depicting Jewish children in the Warsaw Ghetto and in the death camps (1981). Montclair State University collection
During the Nazi occupation of Poland in World War II, Jewish residents of Warsaw were forcibly confined to a district known as the Warsaw Ghetto. The crowded, unsanitary conditions and meager food rations predictably led to a deadly outbreak of typhus fever in 1941. But the outbreak mysteriously halted before winter arrived, rather than becoming more virulent with the colder weather. According to a recent paper in the journal Science Advances, measures put into place by the ghetto doctors and Jewish council members curbed the spread of typhus: specifically, social distancing, self-isolation, public lectures, and the establishment of an underground university to train medical students.
Typhus (aka "jail fever" or "gaol fever") has been around for centuries. These days, outbreaks are relatively rare, limited to regions with bad sanitary conditions and densely packed populations—prisons and ghettos, for instance—since the epidemic variety is spread by body lice. (Technically, typhus is a group of related infectious diseases.) But they do occur: there was an outbreak among the Los Angeles homeless population in 2018-2019.
Those who contract typhus experience a sudden fever and accompanying flu-like symptoms, followed five to nine days later by a rash that gradually spreads over the body. If the disease is not treated with antibiotics, the patient begins to show signs of meningoencephalitis (infection of the brain)—sensitivity to light, seizures, and delirium, for instance—before slipping into a coma and, often, dying. There is no vaccine against typhus, even today. Prevention instead relies on limiting human exposure to the disease vector (body lice) and improving the conditions in which outbreaks flourish.
A scourge for centuries
Something very like typhus was first described in 1489 CE during the War of Granada, in which the Spanish army reported losing 17,000 men to disease. In 1577, assizes held in Oxford, England (later known as the Black Assize) led to an outbreak that killed over 300 people after infected prisoners were brought into court and spread the disease to those present. By 1759, nearly a quarter of English prisoners were dying of gaol fever. There were fatal outbreaks during Napoleon's retreat from Moscow in 1812, during the Irish famine between 1816 and 1819, in Philadelphia in 1837, and all along the Eastern Front during World War I.
Yet another outbreak spread through much of Europe during the Russian Revolution. An estimated 30 to 40 million people contracted the disease in Russia alone, according to co-author Lewi Stone of RMIT University and Tel Aviv University, and between 3 million and 5 million died. Typhus proved to be an equally deadly scourge during World War II, particularly in Nazi-occupied cities and concentration camps. (Anne Frank and her sister, Margot, died of typhus at Bergen-Belsen at the ages of 15 and 19, respectively.)
Nearly 450,000 Jewish residents were packed into the 3.4 square kilometers of the Warsaw Ghetto and rationed a meager 200 calories per day, with little soap and water to keep clean. "With poor conditions, rampant starvation and a population density 5 to 10 times higher than any city in the world today, the Warsaw Ghetto presented the perfect breeding ground for bacteria to spread typhus, and it ripped through the mainly Jewish population there like a wildfire," said Stone. "Of course, the Nazis were well aware this would happen." The paper cites a 1941 document by ghetto commissar Heinz Auerswald noting a "quantum leap" in May deaths, for example, and the situation became so bad that the streets were littered with human corpses covered in newspapers.
Stone et al. point out that the widespread extermination of Jews was partly triggered (or at least rationalized) by public health concerns—a convenient pretext to commit genocide. The paper quotes an October 1941 statement by Jost Walbaum, chief health officer of occupied Poland, calling Jews "carriers and disseminators" of typhus, and offering two solutions. "We sentence the Jews in the ghetto to death by hunger or we shoot them…. We have one and only one responsibility, that the German people are not infected and endangered by these parasites." The authors note that his comments were met with enthusiastic applause, adding, "Today, more than ever society needs to grasp how a virus or bacterium can create utter havoc, dragging humankind to this terminal point of evil."
Stone is a mathematical biologist who has been modeling diseases for decades, a research area that includes reconstructing past epidemics and pandemics, like the Black Death that ravaged Europe in the 14th century, the Spanish Flu of 1918, or more recently, the outbreak of Zika in Brazil before the 2016 Olympic Games. He came across an article that mentioned outbreaks of typhus during World War II and wanted to learn more. After he found some data on typhus in the Warsaw Ghetto, he plotted it on his computer.
It proved challenging to find additional information, however. Ghetto residents often avoided reporting such diseases. That was because the Nazis typically responded with extreme measures, like injecting phenol into the hearts of those who were sick, killing them instantly, or burning a hospital to the ground, patients still inside, because they were infected with typhus.
That said, according to Stone, there were many experienced doctors housed in the Warsaw Ghetto, some of whom survived the war, and they documented the various measures taken to battle the typhus outbreak. He visited libraries all over the world, scouring their archives for relevant documents that might provide more details about the kinds of strategies deployed.
"We know that in other towns of the region, typhus continued on through the winter unabated," said Stone, citing historical records. "So it was odd that just in the Warsaw Ghetto, the disease should die out before winter when it was expected to accelerate. Thus, we are fairly confident that the intervention succeeded." He admitted being surprised by that finding and initially assumed it was the result of a corrupted data set. But the diary of Polish historian and Warsaw Ghetto resident Emanuel Ringelblum provided corroborating evidence. Ringelblum documented the day-to-day happenings in the ghetto and reported a 40-percent drop in the epidemic rate at that time, calling it "irrational."
Stone and his co-authors thought their mathematical models might shed some light on this oddity. The model revealed that there must have been some kind of behavioral change factor, since without it, the epidemic would have peaked in the middle of winter (January 1942) and been as much as two to three times larger. Epidemics typically collapse when there aren't enough susceptible (uninfected) people in a given population to sustain the spread. But less than 10 percent of the Warsaw Ghetto residents had been infected when the outbreak died down in late October 1941.
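The intuition behind that conclusion can be illustrated with a minimal SIR-type compartmental model. This is only a sketch, not the authors' actual model: the transmission and recovery rates, population figure, and intervention timing below are all invented for demonstration. It shows how cutting the transmission rate partway through an outbreak can extinguish an epidemic while only a small fraction of the population has ever been infected—whereas an unmitigated run burns through most of the susceptible pool.

```python
# Minimal SIR-type sketch (illustrative only; all parameters are invented
# for demonstration and are not the paper's fitted values).
def run_sir(beta, gamma=0.1, n=450_000, i0=100, days=365,
            intervention_day=None, beta_reduced=None):
    """Daily Euler-step SIR model; optionally cut transmission on a given day.

    Returns the final attack rate: the fraction of the population
    that was ever infected.
    """
    s, i, r = n - i0, i0, 0.0
    for day in range(days):
        # Behavioral change: lower transmission after the intervention day.
        b = beta_reduced if (intervention_day is not None
                             and day >= intervention_day) else beta
        new_inf = b * s * i / n   # new infections this day
        new_rec = gamma * i       # recoveries this day
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
    return r / n

# Without intervention, most of the population is eventually infected.
unmitigated = run_sir(beta=0.25)
# With transmission sharply reduced mid-outbreak, the epidemic dies out
# while only a small fraction has been infected.
mitigated = run_sir(beta=0.25, intervention_day=30, beta_reduced=0.05)
```

The qualitative point matches the paper's puzzle: an epidemic that stops while the susceptible pool is still large implies something other than herd immunity—here, a drop in the effective transmission rate.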
The ghetto doctors and council members encouraged (and even enforced, when necessary) general good hygiene and cleanliness, despite the terrible conditions. They encouraged social distancing, and those sickened were quarantined. The community even managed to open up soup kitchens, smuggling in extra food to augment their rations. There were public lectures to educate the residents about the importance of such measures and even a rudimentary underground university to train new doctors. Stone et al. estimate that these measures likely prevented as many as 100,000 people from contracting typhus and tens of thousands from dying of the disease.
Nonetheless, Stone was astonished at the sheer number of typhus cases predicted by their model—100,000 infected people over the course of the epidemic—compared to the official reported numbers. And the official number of recorded deaths from typhus and starvation didn't match what he was reading in diaries and reports of the ghetto's epidemiologists, corroborated by a mathematical analysis of food ration cards (the subject of a forthcoming paper) that were handed out to all the ghetto residents each month. "We believe there were far more deaths in 1941 than realized," Stone said, mostly due to typhus, starvation, or both combined, since the two formed a deadly feedback loop.
Spain’s competition watchdog, the ‘Comisión Nacional de los Mercados y la Competencia’ (CNMC) has opened a disciplinary case against Google for alleged anti-competitive practices affecting publishers and Spanish news agencies, it said in a statement on Tuesday.
CNMC said it was investigating whether Google had abused its dominant position in the Spanish market. The proceedings involve Google LLC, Google Ireland Ltd, Google Spain, S.L., and the overall parent company Alphabet Inc.
The alleged practices also include distorting free competition and imposing unfair conditions on press publishers and Spanish news agencies, CNMC said.
The watchdog’s investigation was sparked by a complaint from the Spanish Reproduction Rights Centre (CEDRO).
CNMC will investigate the case over the next 18 months, during which both sides can present their arguments.
According to RTVE, Spain’s national broadcaster, Google will analyse the file and respond to the ‘doubts’ of the CNMC. They said that Google ‘works constructively with publishers in Spain and Europe’ and would ‘need time to analyse the details … as the nature of the claims is still not clear’.
It is not the first action by the Spanish competition regulator against Google, nor the first in which its dominant position in the media sector stands out. In 2021, CNMC warned that Google and fellow technology giant Amazon together controlled 70% of internet advertising in Spain.
Other lawsuits in the Netherlands and the UK have previously accused the technology company of abusing its dominance in the digital advertising market to harm its competitors. France also fined Google in 2021 for not negotiating in good faith compensation for the media for using its news content.
Technology has dramatically changed the way we read and write in the 21st century. From e-books and online articles to social media and instant messaging, technology has made reading and writing more accessible and convenient. However, it has also brought about new challenges and concerns.
One of the biggest benefits of technology is the increased access to information. With just a few clicks, people can access an endless supply of books, articles, and other written materials from all over the world. This has made reading and writing more accessible for people who may not have had the opportunity to do so in the past. It has also allowed for greater collaboration, as people can now share their writing and receive feedback from a global audience.
Technology has also made writing and reading more interactive. Social media and blogs have made it possible for people to engage with written content in real-time, sharing their thoughts, opinions, and experiences with others. This has led to a more dynamic and engaged reading and writing community, with people able to communicate and connect with each other in new and meaningful ways.
However, there are also concerns about how technology is affecting our ability to read and write. One of the biggest concerns is the decline of attention span. With so much information available at our fingertips, it can be difficult to stay focused and absorb what we are reading. Many people find it difficult to concentrate on longer written works, and are instead drawn to shorter, more bite-sized pieces of content.
Additionally, technology has led to an increase in informal writing. Text messaging and instant messaging have popularized shorthand and abbreviations, raising concerns about the impact on people's writing skills and the way they communicate with others.
Another concern is the rise of “fake news.” With the ease of publishing content online, it has become increasingly difficult to differentiate between credible and unreliable sources. This has led to a decline in trust in the media, and has created a need for critical thinking and media literacy skills.
Despite these concerns, technology has also provided new opportunities for writing and reading. E-books and online platforms have made it easier for people to self-publish their work, giving them greater control over the distribution and promotion of their writing. This has created a more democratized publishing industry, and has made it possible for voices and perspectives that may have previously been excluded to be heard.
In conclusion, technology has had a profound impact on reading and writing. While there are certainly challenges and concerns, the increased access to information, the ability to connect and engage with others, and the opportunities for self-publishing have all made reading and writing more accessible and dynamic. As technology continues to evolve, it will be important to address the challenges it presents and embrace the opportunities it provides.
Measuring human intelligence is a complex task that has been attempted by many experts and researchers over the years. Intelligence is often defined as an individual’s ability to think, reason, and solve problems. However, this definition is not enough to capture all the aspects of intelligence. In this article, we will look at some of the ways that human intelligence can be measured and evaluated.
Intelligence Quotient (IQ) Tests: IQ tests are the most commonly used method of measuring intelligence. They are designed to measure an individual’s ability to solve problems, think logically, and understand abstract concepts. The results of an IQ test are expressed as an IQ score, which is a number that represents a person’s intellectual abilities in comparison to the general population.
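The "comparison to the general population" behind an IQ score is simple deviation-score arithmetic, which can be sketched in a few lines. Modern IQ scales typically set the population mean to 100 with a standard deviation of 15; the raw-score statistics used below (mean 30, SD 8) are hypothetical norming values invented purely for illustration.

```python
def iq_from_raw(raw, pop_mean, pop_sd):
    """Convert a raw test score to a deviation IQ.

    Assumes raw scores are roughly normally distributed in the norming
    population; pop_mean and pop_sd come from that norming sample.
    Modern IQ scales conventionally use mean 100 and SD 15.
    """
    z = (raw - pop_mean) / pop_sd   # how many SDs above/below the mean
    return round(100 + 15 * z)

# A raw score one SD above the hypothetical norming mean:
iq_from_raw(38, pop_mean=30, pop_sd=8)  # → 115
```

A score of 115 thus means "one standard deviation above the norming sample's average," not an absolute quantity of intelligence—which is why norming samples and cultural context matter so much in interpretation.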
Achievement Tests: Achievement tests are designed to evaluate an individual’s knowledge and skills in specific subjects such as mathematics, reading, or science. These tests can be a good indicator of a person’s intelligence in a particular subject area and are often used in schools and colleges to assess students’ abilities.
Neuropsychological Tests: Neuropsychological tests are used to evaluate the functioning of the brain and nervous system. These tests can be used to diagnose neurological disorders, measure cognitive abilities, and determine the impact of injury or illness on a person’s cognitive abilities.
Cognitive Ability Tests: Cognitive ability tests are designed to measure an individual’s mental abilities such as memory, reasoning, and problem-solving. These tests can be useful in determining a person’s potential for learning and development.
Behavioral Assessment: Behavioral assessment involves evaluating an individual’s behavior, including their social skills, emotional regulation, and communication abilities. This type of assessment can be useful in identifying areas where an individual may need support or intervention.
Performance-Based Tests: Performance-based tests are designed to measure an individual’s abilities in real-world tasks and activities. These tests can be useful in determining a person’s practical intelligence and can be used in a variety of settings, including schools, workplaces, and healthcare facilities.
It is important to note that no single method of measuring intelligence is perfect and each has its own strengths and limitations. Additionally, the results of intelligence tests can be influenced by many factors such as cultural background, education, and experience. As a result, it is important to use a variety of assessment methods to get a more comprehensive understanding of an individual’s intelligence.
In conclusion, measuring human intelligence is a complex task that involves evaluating a range of cognitive, behavioral, and performance-based abilities. No single test tells the whole story, but by combining methods, experts and researchers can build a more complete picture of an individual's intellectual abilities and potential for learning and development.