the work of history



    Paul Caponigro
    “Egyptian Figure, Museum of Fine Arts, Boston,” 1962

    Museum of Fine Arts, Boston





















    Dante Gabriel Rossetti
    May 12, 1828 – April 9, 1882









    May 12, 1940: Norman Whitfield is born.

    Norman Whitfield was an unlikely star. By his own account he got into music because he “saw Smokey Robinson driving in a Cadillac.” A Harlem native, he did not move to Detroit, Michigan, until he was nineteen, and it was, fatefully, Detroit only because “his father’s car broke down there” on the way back from his grandmother’s funeral. After landing almost accidentally in the birthplace of the still-embryonic company then known as Tamla, Whitfield pestered, and impressed, Tamla’s ambitious founder, Berry Gordy. He landed a job in the company’s quality control department, and before long he had worked his way into songwriting and production. By then, Tamla had become Motown.

    Described as “arguably the first black/African-American producer auteur,” Whitfield would go on to mastermind, behind the scenes at Motown, some of the most popular songs ever to come out of Hitsville USA and some of the most beloved songs of the era, full stop: “Cloud Nine,” “Ball of Confusion,” “Just My Imagination (Running Away with Me),” “I Heard It Through the Grapevine,” “Papa Was a Rolling Stone,” “War” (later, under his own label, for a different sound and a different time, he would write and produce Rose Royce’s classic “Car Wash”).

    After establishing his hitmaking credibility, Whitfield carved out his own corner within Berry Gordy’s Motown machine. At Motown, the bottom line was, generally, the bottom line, which did not always encourage eclecticism, let alone experimentation. But in this space, Whitfield stitched into his work the storms, in culture and sound, of the times, introducing a little modern grit into the polish of the product. He worked most famously and fruitfully (though not always cordially) with the Temptations, whom he helped direct toward a sound sometimes referred to as ‘psychedelic soul.’ Whitfield had at first waved off the sounds of psychedelia as faddish, its gimmicks as purely gimmicky, but he would end up cementing and disseminating them outside of rock as much as anyone.

    Eventually he delved so deeply that his experimentation necessitated an entirely separate outfit, which became the group the Undisputed Truth. There was much reason in this: some of the Temptations had grown disgruntled with Whitfield’s focus on production and his rich instrumentation, which meandered along its own logic with little regard for their vocals, and they would hardly have agreed to become the mere vehicle for his productions. The Temptations’ version of “Ball of Confusion,” for example, ran a standard four minutes; the version that appeared on The Undisputed Truth (1971) allowed Whitfield the space of a trippy ten minutes. Whitfield relished this experimental, indulgent work, but he also produced lean masterpieces, such as Marvin Gaye’s “I Heard It Through the Grapevine.”

    In their earliest years, the Temptations were best known for sweet, smooth love songs like “My Girl,” working in the general spirit of the highly pop-conscious ‘Motown Sound.’ Dennis Coffey, a Motown studio guitarist, later recalled the 1968 exchange that gave shape to the Temptations’ wonderfully skittering “Cloud Nine”: during a recording session, Coffey pulled out a wah-wah pedal. The now-iconic effect was still earning its associations with the hippie late 1960s, as “psychedelia’s non-psychotropic aid,” onstage at Woodstock and in San Francisco, in the music of Jimi Hendrix, Cream, and Sly and the Family Stone. Whitfield might have thought Sly Stone a fad, at first, but Coffey started playing, and Whitfield heard, and he knew: “That’s what I’m looking for!”

    The wah-wah was only one signature, but it was a defining one. Several of Whitfield’s compositions would feature session guitarist Melvin Ragin (better known as “Wah Wah” Watson) and his “chugging, funky, wild wah-wah groove”: in, for example (and perhaps most famously and most gloriously), “Papa Was a Rolling Stone.”









    May 16, 1920: Pope Benedict XV canonizes Joan of Arc.

    At dawn on May 16, 1920, St. Peter’s Basilica was aglow with the slow, soft light of wax candles. Already a massive pilgrimage had gathered in Vatican City, around the basilica, to witness the canonization of Joan of Arc. The audience would swell to around 70,000; among the group of official attendees milled ambassadors to the Holy See bedecked in full diplomatic costume, French and Russian aristocrats, papal courtiers.

    Nearly half a millennium after the nineteen-year-old Jeanne was put on trial, burned thrice at the stake, excommunicated, and declared a martyr, Pope Benedict XV stood at the head of this ceremony and attested to “the sanctity of the bravest maiden within the recollection of men and the most innocent,” and by papal decree “forever [erased] from memory the stain of her unjust condemnation.”

    Twenty-five years after her condemnation, Joan had been quickly redeemed, her executioner excommunicated, and in the near and distant afterlife she was claimed as patron to French nationalists; Catholics of all nationalities; 19th century painters and poets of Romance; Latin American right-wingers and Mexican soldaderas; modern feminist progressives and Marine Le Pen; all of these alike.

    She sustained an unmatched folk following, which only grew, with a particular Romantic gusto, through the 1800s, until the formal process toward sainthood began at the turn of the century. Despite her popular appeal, the road was not smooth—her life and deeds were picked apart under the highest standards, a strange and difficult task given the opacity of the historical record, and of the nature of divine communion itself. She had attacked Paris on the Nativity of Mary and thereby desecrated a feast day; she had lied, broken oaths, and attempted to escape imprisonment, and there were “doubts as to her chastity.” She was heroic, and admirable, even divinely so, certainly, but saintly?

    Her advocates argued that tactical circumstances had driven her to attack Paris on that particular day, and also that her “voices” had not interceded to stop her - so God, clearly, had no qualms about disrupting a feast day. It was in this way that “spiritual lawyers” debated her merits and demerits, alternating between historical accounts, primary sources, and estimations of her character, and this single confounding variable of God’s will. One devil’s advocate argued that “God… allowed Joan to be taken because of the pride with which she was puffed up and because of the extravagant clothes she wore.” A defender argued that she “committed no sin” in donning “male garb,” since she believed she had done so on God’s command. Another objection held that, because she feared and even shunned death, she could not be considered a perfect martyr.

    Her beatification and canonization, like her trial and burning, came finally in a detonation of political buildup. The canonization was pushed through, at the end of World War I, partly as an “attempt to recapture the larger public imagination” as the Church faced down a “tide of socialist, anticlerical thought.” In 1891 Pope Leo XIII had issued an encyclical that attempted to define the ancient institution’s place in a sea of modernism, capitalism, science, socialism, and war. Thirty years later the Church was still thinking. Joan was refashioned once again, for another purpose.

    Another party sat before the Pope, in May 1920, as he delivered his pronouncement. It was a motley group, comprising around “140 descendants of the family of Joan of Arc,” placed in a prominent tribune. They belonged “to all ranks of life” and had travelled “from all parts of France.” Between her execution and canonization they had scattered across France, and now, nearly five hundred years later, they regathered in St. Peter’s Basilica. The elusive historical-mythical Joan had herself been scattered and revived and transformed over the centuries, and was transforming still, then, before their eyes. She was regarded in her own time as both heretical and saintly. John of Lancaster reported that his soldiers faltered and deserted English forces in droves out of their “unholy fear… of a disciple and limb of Satan, called the Maid, who employed spells and sorcery.”

    Another contemporary wrote that the young Jeanne - not the saint nor the witch nor even the commander but the girl - “weeps easily and abundantly. Her face radiates joy.”



    “Dick Leitsch, an early gay-rights activist, who is now in his eighties, arranged to donate his old working files to the archives of the New York Public Library… Jason Baumann, an assistant director for collection development and the LGBTQ initiative coordinator for the NYPL… took inventory… ‘This card file is great,’ he said, flipping through a set of four-by-six index cards on which Leitsch had neatly typed out gay slang terms from antiquity. In the seventies, an ‘Alice Blue Gown’ was a uniformed police officer. A ‘basket’ was ‘the bulge caused by the organs when wearing tight pants.’ Some of the definitions were more nuanced: an ‘auntie,’ Leitsch had written, was ‘an ageing or middle aged homosexual, offtimes effeminate in character,’ or ‘a person of settled demeanor who cautions against intemperate acts.’”

    - “Retrospective Dept.: The Advocate.”



    Linda and Terry Lynn Brown; Carl Iwasaki/The LIFE Images Collection


    On the steps of the Supreme Court, Nettie Hunt explains to her daughter, Nickie, the meaning of Brown v. Board; LOC


    Plaintiffs in Davis v. County School Board of Prince Edward County.


    Linda Brown, age 9, walks past the Sumner School in Topeka, in 1953.

    May 17, 1954: In Brown v. Board of Education, the Supreme Court rules the “separate but equal” doctrine of segregation unconstitutional.

    Linda Brown, who died earlier this year at 75, was seven years old when her father Oliver enrolled her at the Sumner School for the fall semester of 1950. Linda was entering the third grade that fall, and her mother Leola recalled later that “her daddy told her he was going to try to do his best to do something” so that Linda could attend this school, so close to home, alongside “the white children [she] played with.” 

    The Browns lived in an integrated neighborhood in Topeka, and the Sumner School was only a few blocks down the street — and, like many of the elementary schools in Topeka, and Kansas’ other large cities, it was all-white.

    Oliver Brown, for whom Brown v. Board of Education was named, was both a railroad welder and a pastor. Linda remembered later the walk home after the Sumner School turned her away, how she “could just feel the tension within him.” But Oliver did not fall into a historical legal fight by accident; and, though his name was the one that labels the case for history, he was not an individual crusader, Linda explained, he was simply “like a lot of other black parents here in Topeka at that time,” frustrated and ready to fight for his child.

    He and Leola were one pair of Topeka parents who served as plaintiffs in an NAACP class action lawsuit to dismantle the state’s longstanding elementary school segregation. The first parent to volunteer on behalf of her child was Lucinda Todd, the mother of another seven-year-old. There were thirteen plaintiffs and nineteen children in all; each family had tried to enroll their children in the nearby segregated schools, and each had been denied, and they were nineteen among thousands. In 1950, Topeka’s school district operated eighteen elementary schools for white children and only four for Black children.

    Outside Kansas and across the country, similar suits were underway. When Brown reached the Supreme Court it had merged with those suits. Brown now included not only Linda Brown and the children of Lucinda Todd and Lena Carper and Sadie Emmanuel and Marguerite Emmerson, but also Harry Briggs of Summerton, South Carolina; Rosa Bell Davis of Prince Edward County, Virginia; the Bollings of Washington, D.C.; Ethel Belton of Claymont, Delaware; and dozens of others.

    In Hockessin, Delaware, bus routes would not even stop to pick up Sarah Bulah’s daughter, Shirley, and take her to the Bulahs’ “separate but equal” school — a one-room brick schoolhouse a quarter the size of the nearby whites-only school.

    On May 17, 1954, the Warren Court handed down the unanimous decision that “in the field of public education, the doctrine of ‘separate but equal’ has no place.” “Separate but equal” had justified the 1896 Plessy v. Ferguson decision and six cases in the field of public education since. “Separate but equal” had served as the legal basis for segregation in not only public education but every facet of public life for half a century, and it “[had] no place.” Such separation, at least in education, was “inherently unequal.”

    It was not so much, wrote Warren in the opinion, the tangible differences between individual schools, though they demonstrably existed. The Court considered that the act of separation itself “generates a feeling of inferiority… in the community that may affect their hearts and minds in a way unlikely ever to be undone.” Thurgood Marshall, later Justice Marshall, had argued that “segregation thus necessarily imports inequality.”

    Brown empowered the nascent national civil rights movement—it had seemingly destroyed segregation’s legal foundation; Warren hoped that the court’s unanimity would leave no room for doubt. But the furious cultural and political war over desegregation that followed split political parties. It took place in courtrooms and outside them, involved luminaries like Thurgood Marshall and people like the Browns who were “like a lot of” their neighbors. White extremists rampaged; white moderates demurred. Segregation evolved.

    By the time the Brown decision was handed down, four years after Linda Brown was turned away at Sumner, Linda was attending an integrated junior high school. Leola reported: “She was very happy.”




    yadir_copca/Flickr


    ismael villafranco/Flickr


    yadir_copca/Flickr


    Wikimedia Commons


    yadir_copca/Flickr

    Ixmiquilpan [in Mexico] … didn’t always ingratiate itself with outsiders. Its name means “place where the pigweed cuts like knives.” In 1548, when Augustinian friars arrived to convert the local Otomi, they used forced labor to build [the Church of San Miguel Arcangel]. The results may not have been what they expected. All around us in the sanctuary, crumbling frescoes reached up into the nave: centaurs and griffins, eagle knights and coyote warriors. 

    The Otomi hadn’t just repurposed Christian imagery; they’d replaced it with their own. Instead of angels and saints, there were soldiers beheading one another. Instead of Madonnas and Christs, there were pregnant women sprouting from acanthus buds.

    “Bean Freaks: On the hunt for an elusive legume.”



















    Fifty years ago this spring, Stanley Kubrick’s confounding sci-fi masterpiece, 2001: A Space Odyssey, had its premieres across the country. Onscreen it was 2001, but in the theatres it was still 1968, after all. Kubrick’s gleeful machinery, waltzing in time to Strauss, had bounded past an abundance of human misery on the ground.

    Hippies may have saved 2001. Stoned audiences flocked to the movie. David Bowie took a few drops of cannabis tincture before watching, and countless others dropped acid. According to one report, a young man at a showing in Los Angeles plunged through the movie screen, shouting, “It’s God! It’s God!”

    M-G-M thought it had on its hands a second Doctor Zhivago (1965) or Ben-Hur (1959), or perhaps another Spartacus (1960), the splashy studio hit that Kubrick, low on funds, had directed about a decade before. But instead the theatres were filling up with fans of cult films like Roger Corman’s The Trip, or Psych-Out, the early Jack Nicholson flick with music by Strawberry Alarm Clock.

    These movies, though cheesy, found a new use for editing and special effects: to mimic psychedelic visions. The iconic Star Gate sequence in 2001, when David Bowman, the film’s protagonist, hurtles in his space pod through a corridor of swimming kaleidoscope colors, could even be timed, with sufficient practice, to crest with the viewer’s own hallucinations. The studio caught on, and a new tagline was added to the movie’s posters: “The ultimate trip.”

    “Anybody There?: Fifty years later, the tedium and the triumph of 2001: A Space Odyssey.”























    Julius Klinger
    May 22, 1876 – ?1942*

    *Klinger, who was Jewish, was deported to Minsk in 1942, and died sometime that year.









    May 24, 1844: Samuel Morse telegraphs “WHAT HATH GOD WROUGHT?” from the U.S. Capitol to his assistant, Alfred Vail, in Baltimore.

    Morse’s message inaugurated the world’s first commercial telegraph line, which ran roughly forty miles between Washington, D.C. and a northern terminus at the Mount Clare railroad station in Baltimore, Maryland. That distance was minuscule, a mere half-step up along the Chesapeake, compared to the thousands and thousands of miles of electrical wires that would very soon cross the continent and, not long after that, span oceans.

    On the other hand, it was an enormous distance compared to that travelled in Morse’s first public demonstration in 1838: two miles across a village in New Jersey. It was an unimaginable feat, in both distance and speed, compared to anything in all of human history hitherto. The transcontinental telegraph systems famously sank the Pony Express. The touch of a hand bore, by a spark, the power of communication. By 1870, over three million messages had been sent by cable in the United States.

    The death of the mail-bearing horseman was not the only ripple — or upheaval — the telegraph wrought. Henry Adams wrote in his Education that “eighteenth-century troglodytic Boston” came into modern, industrial America with “the opening of the Boston and Albany Railroad; the appearance of the first Cunard Steamers in the bay; and the telegraphic messages which carried from Baltimore to Washington the news that Henry Clay and James K. Polk were nominated for the presidency” in 1844.

    One historian argues that the telegraph fundamentally shaped late 19th century and early 20th century diplomacy: with message times cut down from months to days and hours, instructions, reports, decisions and public reactions flowed nonstop; diplomats maneuvered with more information and more speed and, perhaps, less tact, than ever.

    Morse was not the sole inventor of the telegraph, or even of the code that bears his name. Samuel F. B. Morse was by training not a scientist but an artist; he was also a propagator of anti-Catholic Nativism and of pro-slavery screeds, even as a Northerner at the height of the Civil War, at slavery’s very end. He bathed both his technological and political visions in the Divine: slavery was salvation, abolition a sin, and his telegraph a Biblical revelation. What hath God wrought?

    This mixture of piety and progress, writes James W. Carey, was characteristic of these industrial-modern times; within “the rhetoric of the electrical sublime” resided “a central tenet of middle-class ideology: that ‘communication, exchange, motion brings humanity, enlightenment, progress and that isolation and disconnection are evidence of barbarism and merely obstacles to be overcome’ (Schivelbusch, 1978: 40) … Each improvement in communication, by ending isolation, by linking people everywhere, was heralded as realizing the Universal Brotherhood of Universal Man.”

    In light of this, Morse’s vision for the kind of modernity ushered in by the telegraph seems naively and appropriately lofty, and utterly dissonant. In 1855 an undersea cable was being laid between Newfoundland and Nova Scotia, the first link in a trans-Atlantic telegraph cable; in 1855 Morse, between anti-immigrant and pro-slavery proselytism, wrote to a friend:

    The effects of the Telegraph on the interests of the world, political, social, and commercial, have, as yet, scarcely begun to be apprehended, even by the most speculative minds. I trust that one of its effects will be to bind man to his fellowman in such bonds of amity as to put an end to war. I think I can predict this effect as in a not distant future.





















    To return to Mr. de Mille’s analogy, lighting is like music: for with identically the same resources at hand, no two artists work the same way, even though their results may in the end prove all but identical. So, too, cinematographic lighting has its Mozarts and its Wagners—its artists who specialize in light, delicate tones, and others who prefer the sweeping effect, the crashing crescendo… 

    This, in turn, necessitates the intrusion of the personal pronoun. If I do a thing one way, it does not follow that it is what John Seitz, or Karl Struss, or George Barnes would do. It does not follow that my way is the only way: it is simply the method that my experience and my personal inclinations suggest… 

    Personally I have always felt that the problem of lighting is generally approached from the wrong angle. Instead of approaching any given set or action with the one question, “How shall I light this?” I prefer to approach it with the thought of “What compositions can I make with this set and this action?” Then I proceed to make those compositions—and the lighting automatically takes care of itself.

    Notes from Chinese-American film pioneer James Wong Howe on the art of lighting
    Cinematographic Annual, Vol. 2, 1931

    Internet Archive






















    Today’s (May 25) Google Doodle honors James Wong Howe, who was born in Canton in 1899. He came to the United States in 1904, following his father, who had come, like so many laborers, to work on a transcontinental railroad line.

    Howe established himself as a cameraman during cinema’s early silent era and became, through the 1930s and 40s, one of the industry’s preeminent cinematographers. He was barred from U.S. citizenship until the 1943 repeal of the Chinese Exclusion Act and from legally wedding his wife under longstanding miscegenation laws. He and his wife also came under HUAC scrutiny for suspected Communist ties, but his reputation and work withstood—Howe remained prolific until his death in 1976 and was, by then, a ten-time Academy Award nominee.




    Erich Hartmann/Magnum Photos







    Rachel Carson
    May 27, 1907 – April 14, 1964

    The spiral shells of other snails—these quite minute—left winding tracks on the mud as they moved about in search of food. They were horn shells, and when I saw them I had a nostalgic moment when I wished I might see what Audubon saw, a century and more ago. For such little horn shells were the food of the flamingo, once so numerous on this coast, and when I half closed my eyes I could almost imagine a flock of these magnificent flame birds feeding in that cove, filling it with their color. It was a mere yesterday in the life of the earth that they were there; in nature, time and space are relative matters.

    The Edge of the Sea, Rachel Carson.

    With these surface waters, through a series of delicately adjusted, interlocking relationships, the life of all parts of the sea is linked. What happens to a diatom in the upper, sunlit strata of the sea may well determine what happens to a cod lying on a ledge of some rocky canyon a hundred fathoms below, or to a bed of multicolored, gorgeously plumed seaworms carpeting an underlying shoal, or to a prawn creeping over the soft oozes of the sea floor in the blackness of mile-deep water.

    The Sea Around Us, Rachel Carson.

    An experience like that, when one’s thoughts are released to roam through the lonely spaces of the universe, can be shared… even if you don’t know the name of a single star. You can still drink in the beauty, and think and wonder at the meaning of what you see.

    The Sense of Wonder, Rachel Carson.

    The winds, the sea, and the moving tides are what they are. If there is wonder and beauty and majesty in them, science will discover these qualities. If they are not there, science cannot create them. If there is poetry in my book about the sea, it is not because I deliberately put it there, but because no one could write truthfully about the sea and leave out the poetry.

    Under the Sea Wind, Rachel Carson.

    Saint Rachel, “the nun of nature,” as she is called, is frequently invoked in the name of one environmental cause or another, but few know much about her life and work. “People think she came out of nowhere to deliver this Jeremiad of ‘Silent Spring,’ but she had three massive best sellers about the sea before that,” McKibben says. “She was Jacques Cousteau before there was Jacques Cousteau.” … Carson believed that people would protect only what they loved, so she worked to establish a “sense of wonder” about nature… to articulate sophisticated ideas about the inner workings of largely unseen things.

    “She wanted us to understand that we were just a blip,” says Linda Lear, author of Carson’s definitive biography, “Witness for Nature.” “The control of nature was an arrogant idea, and Carson was against human arrogance.” … “Silent Spring” was more than a study of the effects of synthetic pesticides; it was an indictment of the late 1950s. Humans, Carson argued, should not seek to dominate nature through chemistry, in the name of progress.

    “How ‘Silent Spring’ Ignited the Environmental Movement”











    “Vital forms” are shapes inspired by nature; innovative artists and designers used them in the 1940s and 1950s to evoke living entities, ranging from amoebas and plant life to the human figure… Every period has its own visual vocabulary, which it partly borrows from the past and partly invents to meet new needs. The language of vital forms expressed the dualities of its times: the hopes and fears, the dreams and nightmares, of the middle years of the twentieth century were reflected in organic forms that were highly mutable, seemingly as changeable as life itself […]

    The gravity of the war changed the course of American art and design as well. It made 1930s American Scene painting and Regionalism, which often showed an agrarian daily life, seem naïve and nostalgic, and WPA photographs outdated. At the same time, the free-thinking Surrealist artists who came to this country from Europe to escape Hitler exerted enormous influence. In this era of international crisis, American artists and designers often used organic forms, especially the human figure, as a way of reasserting humane values.

    Public awareness of the Atomic Age began with the horrendous explosions in 1945 over Hiroshima and Nagasaki that ended the war. It was not until 1955 that atomic energy became available for peacetime uses in the United States. During this time of uncertainty, artists responded to the atomic phenomenon with powerful abstractions […]

    With the onset of the Atomic Age, disturbing, mutant forms—the result of exposure to radiation—began to appear in films and novels. Yet at the same time, the playful, positive form of the atom’s structure, with electrons circling the nucleus, also became an integral part of the period’s imagery, in art as well as in domestic products. By the mid-1950s, some Americans were optimistic about the atom’s role as a new source of energy, a replacement for coal and oil in generating electricity.

    The Cold War between the United States and the Soviet Union and the resulting arms race, however, fed the persistent fear of nuclear destruction. The paradox remains with us today, as we continue to struggle with how to reconcile the positive and negative aspects of atomic energy.

    (Brooklyn Museum) Vital Forms: American Art and Design in the Atomic Age, 1940-1960













    May 30, 1941: Manolis Glezos and Apostolos Santas, members of the Greek Resistance, climb the Acropolis of Athens and tear down the swastika.

    Manolis Glezos and Apostolos Santas were both nineteen years old when Wehrmacht soldiers marched south on Macedonia in April 1941. Three weeks later, when German tanks rolled into Athens and took the city and the country, the pair hatched a stunt, involving secret tunnels and torches, in defiance of the impending occupation.

    Greek troops had beaten back Italian forces that winter, but the new German onslaught quickly overwhelmed them. Optimism born of that temporary victory over the Italian campaign was extinguished. The Germans, it was clear, came bearing a long, painful night. In mid-April, facing the imminent German entrance into Athens, Prime Minister Alexandros Koryzis shot himself, and King George II and his family fled to Crete before departing for exile in Great Britain. The thin spread of Allied forces left in Greece, badly outnumbered and depleted of resources, could stave off very little before they too departed. In the aftermath of this retreat, Italy took hold of the majority of the peninsula — but the country’s vital regions, including the small center within Attica containing Athens, fell under German administration.

    Walther Wrede, director of the German Archaeological Institute at Athens and leading Nazi party representative in Greece, was tasked with welcoming the occupiers to Athens. As the Wehrmacht took Athens and draped the city in Nazi colors, as Nazi flags unfurled over balconies and doorways, Wrede wrote: “I spring to our lookout post on the upper floor. The cry: ‘Swastika over the Acropolis!’ rings through the house… and thus prepared we stand at the windows waiting for the first German soldiers.”

    When Hitler wrote to Mussolini a year later regarding the Duce’s visit to Greece, he described the Acropolis as the “place where all that we today call human culture found its beginning.” Now the Nazi war flag — red emblazoned with black swastika, bars, and Iron Cross — flew above. According to wartime folk legend, the Greek soldier on guard at the Acropolis the first day of occupation chose to leap off the rock rather than lower the Greek flag and raise the swastika overhead.

    Apostolos Santas, who died in 2011, and Manolis Glezos, who was, until his resignation in 2015, the oldest member of the European parliament as a member of SYRIZA, were first-year college students in 1941. But both already had early political experience, having participated in anti-fascist organizing against Greece’s own authoritarian government and subsequently the Italian invasion.

    Under the cover of night, the two boys, wielding a torch and a pocketknife, crept into a cave whose mouth was hidden in undergrowth at the foot of the Acropolis. Together they climbed to the top of the Acropolis and surfaced near the Erechtheion. At the summit, bathed in moonlight, as Santas later described, they paused to look upon the temples and “became emotional.”

    They then scaled the flagpole and tore the Nazi flag from its post overlooking Athens. The two climbed down, embraced, “did a quick dance,” and escaped undetected. They kept a corner of the flag, the upper left carrying the Iron Cross, and discarded the remaining scraps down a well. The theft — and the weight of their act — was discovered the next morning. German authorities publicly condemned it, inadvertently advertising the brazen stunt to a dispirited public, and sentenced the perpetrators to death in absentia, though they had no idea who the culprits might be. They would never learn, at least not during the course of the occupation.

    Both Glezos and Santas survived the death sentence, the war, and the civil war that followed it, but not unscarred. Both continued their activities with communist factions of the Resistance, and both were imprisoned multiple times, first by the German occupiers, later by their fellow Greeks during the partisan violence that erupted in the vacuum left by the German departure. In 1963, a New York Times correspondent described Glezos, by then a symbol no longer of heroic Allied resistance but of the communist threat in Greece, as “heroic but dangerous.” Later, Glezos said that, despite their youth, theirs had been “a conscious act… The swastika on the Acropolis offended all human ideals.”


    Ellsworth Kelly
    May 31, 1923 – December 27, 2015

    The process by which the “already-made” shape is suddenly available to Kelly—while it escapes most of us—is one of defamiliarization, of what the Russian formalists called ostranenie. It came upon the young Kelly years before he became an artist, and the strong memories he has about several childhood experiences are perhaps the reason his work remains so fresh. I’ll quote two such memories, but there are many more:

    I remember that when I was about ten or twelve years old I was ill and fainted. And when I came to, my head was upside down. I looked at the room upside down, and, for a brief moment I couldn’t understand anything until my mind realized that I was upside down and I righted myself. But for the moment that I didn’t know where I was, it was fascinating. It was like a wonderful world.

    “Ellsworth Kelly’s Dream of Impersonality”


    Man Ray
    “Laboratory of the Future,” 1935

    Museum of Modern Art


    Lizzie Douglas AKA Memphis Minnie
    June 3, 1897 – August 6, 1973

    She was coal black beautiful, they say, with soft black hair she could fix any way she wanted to, and all gold teeth across the front. In joints, on the street, at house parties and fish fries, she picked and sang while chewing Brown Mule tobacco… She swore freely, dipped Copenhagen snuff, shot craps, gambled at cards, and bested Big Bill Broonzy in picking contests. In blues circles, she was rumored, respectfully, to have shot off the arm of a man who tried to mess with her, or she chopped it off with a hatchet… 

    When she came down with what was diagnosed as meningitis and yellow fever, and the doctors gave up on her, she wrote “Memphis Minnie-jitis Blues,” drank a quart of whiskey her husband brought to the hospital, and just sweated whatever it was out. She drank hard — gin, corn whiskey, potato-and-yeast home-brew, and Wild Irish Rose wine — but lived to be seventy-six.

    She held her own from Mississippi to Chicago, right through the Depression, in country blues and urban blues, acoustic and electric… She wrote any number of songs that stirred food, rue, relish, and sex together in roughly equal parts: “Keep On Eating,” “Banana Man Blues,” “Lean Meat Don’t Fry.”

    She grew up in rural Mississippi near Memphis, started performing in that city, and moved back there in old age, but where the Minnie came from, nobody knows—she was born Lizzie Douglas… the relatives called her by her childhood name, Kid.

    “I’m so glad,” Minnie sings, “that I ain’t nobody’s tool.”… At its roots, the blues isn’t jaded. It’s as pretty as it can be…

    “Memphis Minnie’s Blues: A Dirty Mother For You,” Roy Blount, Jr.


    Views at Abu Simbel, Nubia, Southern Egypt

    Brooklyn Museum


    June 5, 1981: The CDC Morbidity and Mortality Weekly Report identifies strange cases of pneumonia in gay men in Los Angeles, later recognized as the first report of the AIDS crisis.

    First there were five: each of the men was in his late twenties or thirties, three of them “previously healthy,” and each had come down with what the CDC said was Pneumocystis pneumonia, caused by Pneumocystis carinii. This was odd. “You only got Pneumocystis when something had kicked the bottom out of your natural immunities,” wrote Randy Shilts in And The Band Played On, the definitive chronicle of the public health crisis that unraveled over the next decades.

    Something had been lurking in the gay communities of cities from Los Angeles to San Francisco to New York. Diseases like toxoplasmosis and Kaposi’s sarcoma, and infections via cytomegalovirus and P. carinii, were felling perfectly healthy adults, as if they had suddenly become long-suffering cancer patients. Doctors had observed these seemingly disparate cases for years, and though the men had one thing in common, that they were gay, a few scattered observations of pneumonia or cancer could not be deemed an epidemic. Before the disease was named or its virus identified, it was speculated that the cause might be something environmental or ‘lifestyle’ related, like tainted poppers.

    Michael S. Gottlieb, a young immunologist at UCLA, and Wayne Shandera, a CDC field investigator based in Los Angeles, had been studying such cases. In April 1981 Gottlieb had phoned Shandera to discuss his patients, these gay men suffering from pneumonia, and they agreed—not on what it was, exactly, but that it was odd.

    These patients “in some fashion… were immunosuppressed,” Shandera observed, but what he observed made him think not of young men in their physical primes but “of cancer patients… children with lymphocytic leukemia,” and even “starvation victims after World War II.”

    The doctors decided that these findings and their explosive potential were too pressing to put through a lengthy academic peer review process. They turned instead to the CDC’s Morbidity and Mortality Weekly Report, which was not a formal academic journal but a quick, time-sensitive digest. Later that year Gottlieb would publish an official article in The New England Journal of Medicine; for now, the imperative was to get the word out. 

    The MMWR was widely distributed throughout the American medical community—whatever vital update appeared in the latest issue was immediately discussed across the country by doctors, researchers, hospitals, health departments, public health policymakers. 

    In the June 5, 1981 issue, under “Epidemiological Notes and Reports,” and following a dispatch about Dengue fever in Caribbean tourists, a brief and unsensational bulletin appeared. The title dryly read “Pneumocystis Pneumonia — Los Angeles.”

    Between October 1980 and May 1981, the MMWR reported, five men aged 29–36 had been “treated for biopsy-confirmed Pneumocystis carinii pneumonia at 3 different hospitals in Los Angeles, California.” Two of the men had died. The report noted briefly that all of the men were “active homosexuals,” but also that none of them knew each other or any of the same people. The report did not speculate on what any of this meant. It was the first official inscription in the timeline of the AIDS crisis.

    The relationship between the medical establishment and a gay community emerging, for the first time, from secrecy and shame, remained complicated. The CDC did not want to report the potential outbreak as a “gay epidemic,” for fear both of alienating the communities and of stoking prejudice and hysteria against them. Shandera maintained that “the Centers for Disease Control, because it was infecting gay men, put [the report] on the second page of MMWR.”

    This tentative first report alerted the wider medical community to this mysterious outbreak. The CDC quickly assembled a task force, always aware of, accounting for, and constrained by the fact that the new Reagan administration had vowed to dramatically carve up the federal budget. 

    Researchers first called the disease GRID (gay-related immune deficiency), and by 1982 it was, alternatively, AID (acquired immunodeficiency disease). In 1982 a presidential spokesperson responded to a journalist’s inquiry (“…does the President have any reaction to the announcement – the Center for Disease Control in Atlanta, that AIDS is now an epidemic and [has] over 600 cases?”) that, to his knowledge, no one in the White House knew about an AIDS epidemic. Between June 1 and September 1982, the CDC received reports of nearly 600 cases of what, by late 1982, it officially referred to as AIDS.