Is the Afghan hat related to the Macedonian hat?

This is a Greek king of Bactria (Afghanistan):

The fashion took hold after Alexander the Great; it is based on the Macedonian beret known as the kausia:

This is an Afghan hat ("Pakol"):


Worn here by the Pakistani politician Imran Khan


Worn here by the Afghan political and militant leader Ahmad Shah Massoud

Is there a connection?


The short answer is that there is indeed a plausible connection.

B. M. Kingsley pointed to this connection as early as 1981, as seen in the following abstract:

The so-called Macedonian kausia was originally identical with a cap often called a chitrali still worn today by men in Afghanistan, Pakistan and, above all, in Nuristan. No kausia is mentioned in Greek literature before 325/24 B.C. No depiction of the cap can be securely dated earlier than that time. The kausia came to the Mediterranean as a campaign hat worn by Alexander and veterans of his campaigns in India. Descendants of the people from whom the cap was taken may well survive in Asia today.

The author observed this specific type of hat, the chitrali (also known as the pakol), in Pakistan, Afghanistan, and above all in the Nuristan province of eastern Afghanistan, and noted that it is virtually identical to the kausia.

From Kingsley, Bonnie M. “The Cap That Survived Alexander.” American Journal of Archaeology, vol. 85, no. 1, 1981, pp. 39-46. JSTOR, www.jstor.org/stable/504964.

The author reaffirms the Indian origin of the kausia in a later article, published posthumously.

See Kingsley, Bonnie. “Alexander's 'Kausia' and Macedonian Tradition.” Classical Antiquity, vol. 10, no. 1, 1991, pp. 59-76. JSTOR, www.jstor.org/stable/25010941.

In conclusion: the connection is plausible, but it cannot be asserted with certainty.


8 Facts About Osama bin Laden's Final Hideout

It took nearly a decade following the September 11, 2001, terrorist attacks on New York City and Washington, D.C., for American intelligence authorities to realize that al-Qaeda founder Osama bin Laden—mastermind of the 9/11 plot—hadn’t been skulking in a cave or a remote tribal area of Pakistan. For about the last five years of his life as a fugitive, his home had been a large compound in Abbottabad, shared with several wives and children and a handful of supporters. The location was scarcely a mile from the Pakistan Military Academy in Kakul.

How did the godfather of modern radical Islamic terrorism live during these years of self-enforced isolation? Very cautiously.


The Irish Brigade

At the outbreak of the Civil War in 1861, thousands of Irish and Irish-American New Yorkers enlisted in the Union Army. Some joined ordinary—that is, non-Irish—regiments, but others formed three all-Irish voluntary infantries: the 63rd New York Infantry Regiment, organized on Staten Island, and the 69th and 88th New York Infantry Regiments, organized in the Bronx. These units would form the core of what would come to be called the Irish Brigade.

Did you know? After the Civil War, Thomas Francis Meagher became the Acting Governor of the Montana Territory. He drowned in the Missouri River in 1867.

Ethnic units were a way for the Union Army to help win Irish support for its cause. This support was not guaranteed: Though most Irish immigrants lived in the North, they were sympathetic to what they saw as the Confederacy’s struggle for independence from an overbearing government—it reminded them of their own fight to be free of the British. Also, many Irish and Irish Americans were not against slavery. On the contrary, they favored a system that kept blacks out of the paid labor market and away from their jobs. As a result, Union officials had to promise many things in addition to ethnic regiments (enlistment bonuses, extra rations, state subsidies for soldiers’ families, Catholic chaplains) in order to assure that the North’s largest immigrant group would be fighting with them and not against them.

In February 1862, an Army captain named Thomas Francis Meagher became brigadier general of the nascent Irish Brigade. Meagher was born in Ireland, where he had been active in the “Young Ireland” nationalist movement and, as a result, was exiled to the British penal colony of Tasmania, Australia. He escaped from Australia in 1853 and came to the United States, where he became a well-known orator and activist on behalf of the Irish nationalist cause. He joined the Army early in 1861. Meagher was ambitious, and he knew that if he could raise an all-Irish infantry brigade, Union Army officials would have to make him its commander. He also hoped that an Irish Brigade in the U.S. would draw attention to the nationalist cause at home.

In the spring of 1862, Union Army officials added a non-Irish regiment, the 29th Massachusetts, to the Irish Brigade in order to beef up its numbers before the Peninsula Campaign for the capture of Richmond, Virginia, the capital of the Confederacy. In October, another Irish regiment, the 116th Pennsylvania Infantry Regiment from Philadelphia, joined the brigade in time for the battle at Harper’s Ferry, Virginia. The next month, officials swapped the non-Irish 29th Massachusetts Regiment for the Irish 28th Massachusetts.


Tajik

Tajiks are believed to be of Iranian origin and are also referred to as Farsi. They are the second largest ethnic group in Afghanistan, making up an estimated 27% of the nation’s population, and they speak a Persian dialect known as Dari. According to a US State Department report released in 2009, Tajiks are 98% Sunni Muslims. Tajik meals range from sweet dishes such as halwa to savory ones such as pulao (spiced rice). Tajiks are famous for their elaborate embroidery on fabric; these beautiful patterns are also found on their carpets, wall hangings, and headpieces. Decorative carvings on stone can be seen in Tajik homes.


Don’t Touch That Dial!

A respected Swiss scientist, Conrad Gessner, might have been the first to raise the alarm about the effects of information overload. In a landmark book, he described how the modern world overwhelmed people with data, and warned that this overabundance was both “confusing and harmful” to the mind. The media now echo his concerns with reports on the unprecedented risks of living in an “always on” digital environment. It’s worth noting that Gessner, for his part, never once used e-mail and was completely ignorant about computers. That’s not because he was a technophobe but because he died in 1565. His warnings referred to the seemingly unmanageable flood of information unleashed by the printing press.

Worries about information overload are as old as information itself, with each generation reimagining the dangerous impacts of technology on mind and brain. From a historical perspective, what strikes home is not the evolution of these social concerns, but their similarity from one century to the next, to the point where they arrive anew with little having changed except the label.

These concerns stretch back to the birth of literacy itself. In parallel with modern concerns about children’s overuse of technology, Socrates famously warned against writing because it would “create forgetfulness in the learners’ souls, because they will not use their memories.” He also advised that children can’t distinguish fantasy from reality, so parents should only allow them to hear wholesome allegories and not “improper” tales, lest their development go astray. The Socratic warning has been repeated many times since: The older generation warns against a new technology and bemoans that society is abandoning the “wholesome” media it grew up with, seemingly unaware that this same technology was considered to be harmful when first introduced.

Gessner’s anxieties over psychological strain arose when he set about the task of compiling an index of every available book in the 16th century, eventually published as the Bibliotheca universalis. Similar concerns arose in the 18th century, when newspapers became more common. The French statesman Malesherbes railed against the fashion for getting news from the printed page, arguing that it socially isolated readers and detracted from the spiritually uplifting group practice of getting news from the pulpit. A hundred years later, as literacy became essential and schools were widely introduced, the curmudgeons turned against education for being unnatural and a risk to mental health. An 1883 article in the weekly medical journal the Sanitarian argued that schools “exhaust the children’s brains and nervous systems with complex and multiple studies, and ruin their bodies by protracted imprisonment.” Meanwhile, excessive study was considered a leading cause of madness by the medical community.

When radio arrived, we discovered yet another scourge of the young: The wireless was accused of distracting children from reading and diminishing performance in school, both of which were now considered to be appropriate and wholesome. In 1936, the music magazine the Gramophone reported that children had “developed the habit of dividing attention between the humdrum preparation of their school assignments and the compelling excitement of the loudspeaker” and described how the radio programs were disturbing the balance of their excitable minds. The television caused widespread concern as well: Media historian Ellen Wartella has noted how “opponents voiced concerns about how television might hurt radio, conversation, reading, and the patterns of family living and result in the further vulgarization of American culture.”

By the end of the 20th century, personal computers had entered our homes, the Internet was a global phenomenon, and almost identical worries were widely broadcast through chilling headlines: CNN reported that “Email ‘hurts IQ more than pot’,” the Telegraph that “Twitter and Facebook could harm moral values” and the “Facebook and MySpace generation ‘cannot form relationships’,” and the Daily Mail ran a piece on “How using Facebook could raise your risk of cancer.” Not a single shred of evidence underlies these stories, but they make headlines across the world because they echo our recurrent fears about new technology.

These fears have also appeared in feature articles for more serious publications: Nicholas Carr’s influential article “Is Google Making Us Stupid?” for the Atlantic suggested the Internet was sapping our attention and stunting our reasoning; the Times of London article “Warning: brain overload” said digital technology is damaging our ability to empathize; and a piece in the New York Times titled “The Lure of Data: Is It Addictive?” raised the question of whether technology could be causing attention deficit disorder. All of these pieces have one thing in common—they mention not one study on how digital technology is affecting the mind and brain. They tell anecdotes about people who believe they can no longer concentrate, talk to scientists doing peripherally related work, and that’s it. Imagine if the situation in Afghanistan were discussed in a similar way. You could write 4,000 words for a major media outlet without ever mentioning a relevant fact about the war. Instead, you’d base your thesis on the opinions of your friends and the guy down the street who works in the kebab shop. He’s actually from Turkey, but it’s all the same, though, isn’t it?

There is, in fact, a host of research that directly tackles these issues. To date, studies suggest there is no consistent evidence that the Internet causes mental problems. If anything, the data show that people who use social networking sites actually tend to have better offline social lives, while those who play computer games are better than nongamers at absorbing and reacting to information with no loss of accuracy or increased impulsiveness. In contrast, the accumulation of many years of evidence suggests that heavy television viewing does appear to have a negative effect on our health and our ability to concentrate. We almost never hear about these sorts of studies anymore because television is old hat, technology scares need to be novel, and evidence that something is safe just doesn’t make the grade in the shock-horror media agenda.

The writer Douglas Adams observed how technology that existed when we were born seems normal, anything that is developed before we turn 35 is exciting, and whatever comes after that is treated with suspicion. This is not to say all media technologies are harmless, and there is an important debate to be had about how new developments affect our bodies and minds. But history has shown that we rarely consider these effects in anything except the most superficial terms because our suspicions get the better of us. In retrospect, the debates about whether schooling dulls the brain or whether newspapers damage the fabric of society seem peculiar, but our children will undoubtedly feel the same about the technology scares we entertain now. It won’t be long until they start the cycle anew.


Amānullāh Khan

Amānullāh Khan, (born June 1, 1892, Paghmān, Afghanistan—died April 25, 1960, Zürich, Switzerland), ruler of Afghanistan (1919–29) who led his country to full independence from British influence.

A favoured son of the Afghan ruler Ḥabībullāh Khan, Amānullāh took possession of the throne immediately after his father’s assassination in 1919, at a time when Great Britain exercised an important influence on Afghan affairs. In his coronation address Amānullāh declared total independence from Great Britain. This led to war with the British (see Anglo-Afghan Wars), but fighting was confined to a series of skirmishes between an ineffective Afghan army and a British Indian army exhausted from the heavy demands of World War I (1914–18). A peace treaty recognizing the independence of Afghanistan was signed at Rawalpindi (now in Pakistan) in August 1919.

Although a charming man and a sincere patriot and reformer, Amānullāh was also impulsive and tactless and tended to surround himself with poor advisers. Shortly after ascending the throne, he pushed for a series of Western-style reforms, including an education program and road-building projects, but was opposed by reactionaries. In 1928 he returned from a trip to Europe with plans for legislative reform and the emancipation of women, proposals that caused his popular support to drop and enraged the mullahs (Muslim religious leaders). Later that year a tribal revolt resulted in a chaotic situation during which a notorious bandit leader, Bacheh Saqqāw (Bacheh-ye Saqqā, “Child of a Water Carrier”), seized Kabul, the capital city, and declared himself ruler. Amānullāh attempted to regain the throne but, for reasons that are unclear, failed to do so. He abdicated in January 1929 and left Afghanistan for permanent exile that May.


The Persian lace connection

The earliest published crochet pattern dates back to the early 19th century. Penelope, an early woman’s magazine, included a pattern for a crocheted bag. Published in Amsterdam in 1824, it set my heart racing (I have a bit of a crochet bag obsession). The detailed pattern was seized on by the well-to-do European ladies of the day.

Worked in bands of chain mesh from the top, the bag was given substance by denser bands of slip stitch crochet in between. A star at the base was worked in UK double crochet.

The instructions give a big clue about the ancient history of crochet. The bag is made using a tambour hook, a slim metal shank with a sharp hook, fitted into a wooden or bone handle. Crochet history expert Lis Paludan is convinced that our modern crochet techniques developed directly from tambour lace making, or tambouring.

Tambour lace was worked using very fine threads of silk, silver and gold, which were hooked in and out of the fibres of a background mesh. The chain stitches used between motifs in the mesh look very like crochet.

To make it easier to work, the fabric mesh was stretched over a circular frame, a non-musical tambourine (“tambour” is French for drum). But this was a labour-intensive process; it must have taken hours to make even a short length of lace to edge something like a handkerchief.

Tambouring became popular in Western Europe from the mid-18th century, and illustrations survive from 1700 showing French tambour hooks. I find it quite persuasive that these were called crochets, derived from croc, the French for hook. It’s definitely possible that the first steps towards modern crochet were taken when the mesh was abandoned and just hook and thread were used to form chains and more complex motifs.

What about tambouring pre-1700? Possibly, the skills were transported to Europe around that time via the trade routes from India, Turkey and Persia. A form of tambour lace is still made in Kashmir in India, described today as Aari embroidery.


Mohammad Daud Khan

Mohammad Daud Khan, (born July 18, 1909, Kabul, Afghanistan—died April 27, 1978, Kabul), Afghan politician who overthrew the monarchy of Mohammad Zahir Shah in 1973 to establish Afghanistan as a republic. He served as the country’s president from 1973 to 1978.

Educated in Kabul and France, Daud Khan, a cousin and brother-in-law of Zahir Shah, pursued a career in the military. He rose to command an army corps in 1939 and held the post of minister of defense from 1946 to 1953. As prime minister (1953–63) he instituted educational and social reforms and implemented a pro-Soviet policy. He was also an advocate of Pashtun irredentism, the creation of a greater “Pashtunistan” in Pashtun areas of Pakistan and Afghanistan. This caused the relationship between the two countries to deteriorate and eventually led to Daud Khan’s resignation. His overt participation in politics was severely curbed in 1964, when a new constitution barred members of the royal family from holding political office.

On July 17, 1973, Daud Khan led a coup that overthrew Zahir Shah. He declared Afghanistan a republic with himself as president. Once in power, Daud Khan sought to suppress the left and lessen the country’s dependence on the Soviet Union. On April 27, 1978, however, he was killed in a coup that brought to power a communist government under Nur Mohammad Taraki.


How the Balkan Peninsula Came to Be

Geographers and politicians divide the Balkan Peninsula in a variety of ways due to a complicated history. The root cause of this is that a number of Balkan countries were once part of the former country of Yugoslavia, which was formed at the end of World War II and broke apart into distinct countries beginning in the early 1990s.

Some Balkan states are also considered "Slavic states" as they are typically defined as Slavic-speaking communities. These include Bosnia and Herzegovina, Bulgaria, Croatia, Kosovo, Macedonia, Montenegro, Serbia, and Slovenia.

Maps of the Balkans often define the countries listed above as Balkan using a combination of geographic, political, social, and cultural factors. Other maps, which take a strictly geographical approach, include the entire Balkan Peninsula as Balkan; these add mainland Greece, as well as the small portion of Turkey that lies northwest of the Sea of Marmara, to the Balkan states.


Negotiations with the United States

The Taliban and the United States began meeting in 2018, with the help of Saudi Arabia, Pakistan, and the United Arab Emirates, the only countries to have a diplomatic relationship with both parties. The discussions focused on the withdrawal of U.S. troops from Afghanistan, though the United States hoped to eventually push the Taliban to negotiate with the central government. In July 2019 the discussions included central government officials for the first time, who agreed with representatives of the Taliban on general principles for future reconciliation talks. The Taliban’s representatives were not authorized by the organization to negotiate in an official capacity, but observers considered the meeting a successful icebreaker.

By early September the United States and the Taliban had reportedly come to an agreement in principle and were narrowing in on the details of a signed deal when an attack by the Taliban in Kabul killed a U.S. service member. Days later a secret meeting between top U.S. and Taliban officials was called off by the United States; the cancellation was blamed on the attack.

A deal was struck in late February 2020. The Taliban agreed to begin talks with the central government within 10 days of signing the agreement and to prevent al-Qaeda and the Islamic State in Iraq and the Levant (ISIL; also called Islamic State in Iraq and Syria [ISIS]) from operating in Afghanistan. The United States, for its part, would phase out its troop presence in the country over a 14-month period; it began reducing troop levels in March. After a delay caused by the central government’s reluctance to carry out a prisoner swap promised to the Taliban by the United States, negotiations between the Taliban and the central government began on September 12. By April 2021, however, little progress had been made in the negotiations. Nevertheless, the United States reiterated its commitment to withdraw its troops, although it delayed its deadline from May to September.

