A History of Queen Anne’s Lace

What struck me most forcefully in reading up on the history of contraception and abortion was that, step by step, women have been conditioned to believe that choosing to control their own reproductive process, even to the decision to prevent conception, was at best immoral, or at worst, criminal.

Years ago, I was watching an educational TV show, and the narrator discussed plants that were not native to the Americas but which were now common. As an example, the script mentioned Queen Anne’s Lace, stating that the seeds were carried to the Americas caught in blankets and clothing of the European settlers.

I could not stop laughing. I was well aware that the seeds of Queen Anne’s Lace, taken as a morning-after tea, were the most effective of all the early forms of birth control–at least since silphium was hunted to extinction by Roman and Egyptian women desperate to prevent conception. The seeds of Queen Anne’s Lace weren’t ferried to the Americas accidentally, hitchhiking on property, but quite purposefully, by women who preferred not to be worn out or die due to too-frequent childbearing.

For centuries, knowledgeable midwives instructed the women they served in the lore of birth control—difficult, and not totally reliable, but not completely impossible in the centuries before the development of the diaphragm and the contraceptive pill. And, yes, their knowledge also included methods of abortion, customarily using herbs. Concoctions compounded from celery root and seed, hedge hyssop, cotton root, Cretan dittany and spruce hemlock, mistletoe leaves and horseradish, cinchona bark, ashwagandha and saffron, wooly ragwort, castor oil, blue and black cohosh, evening primrose, and even the remarkably dangerous pennyroyal, tansy, and ergot of rye made herbal abortion common when contraception failed. The concoctions were so prevalent that ads for patent medicines to cure “delayed menstruation” ran regularly in women’s magazines throughout the 1800s—that is, until the passage of the Comstock Act in 1873 criminalized even the possession of information on birth control.

The world has turned many times since the Comstock Act: from the invention of the contraceptive pill, through the self-help clinics of the late 1960s that instructed women in the practice of menstrual extraction, to Roe v. Wade. The morning-after pill was introduced, a chemical solution at last replacing that centuries-old use of abortifacient herbs.

I absolutely do not, will not, debate the wrongness or rightness of any of this, from Queen Anne’s Lace to the present day. To me, decisions regarding birth control and abortion remain always a choice best made by the woman involved, in accordance with her conscience and personal situation. But what struck me most forcefully in reading up on the history of contraception and abortion was that, step by step, women have been conditioned to believe that choosing to control their own reproductive process, even to the decision to prevent conception, was at best immoral, or at worst, criminal.

We think of the Middle Ages as a time of great ignorance, yet it was then that midwives—wisewomen—practiced, sharing their expertise and knowledge with the female population at large, easing the pain of childbirth and preventing many maternal deaths by their skill. And it was then, too, that such women were hunted down, tortured, burned, and hanged as witches, effectively silencing their knowledge for generations. Women were left in the hands of male doctors who, shrugging, pronounced, “Maternity is eternity,” consigning countless women and infants to death as babies were delivered in filthy conditions with unwashed hands.

Circle the world a few times on its axis, and enter the 1900s, when horrific deaths by botched back alley abortions were common. Young and desperate women bled to death or died horribly of septicemia. Circle again, and information on contraception was readily available, along with new forms of birth control. Contraceptive creams and condoms were sold over the counter. Legal abortion gave a measure of safety to the procedure. The morning-after pill became available for those who had either been careless or experienced the horrors of rape.

History, they say, always repeats itself. And so as society swings perilously close once more to the era of illegal and back alley abortions, so it may also oscillate to women who reclaim the ancient knowledge that gave them power over their own reproductive processes: to the natural methods that provided women a way to make their decisions in accordance with their conscience.

The morality of these decisions is not truly the question, for no matter what is legislated, women will continue to fight for and gain absolute control over their own bodies.  They will continue to make their personal choices regarding reproduction. The Pendulum of Queen Anne’s Lace, you might call it. History will, genuinely, always repeat itself.

Prom Night, Then and Now

Because the month of May is when so many high schools hold their proms, an acquaintance asked me to re-publish this post from last year.

When I worked in an office, as women will do when gathered together day after day, we often found time to switch into “chat and gossip” mode. On one day in particular, I recall, a supervisor had proudly displayed to a group of us ladies the prom photos taken of his oldest daughter. That sparked a discussion of school dances in general, and prom gowns specifically.

Each of the women present took turns describing her beloved senior or junior prom gowns and favorite dance dresses. I stayed on the periphery of this conversation, volunteering nothing, and fortunately each of the women was too wrapped up in fond memories of her own Cinderella moments to note my reticence.  My relief was enormous; I didn’t know what I would have said if they had turned to ask me about my dance dresses.  Made something up, perhaps – probably – because admitting the truth would have been humiliating: that I had never had a prom gown, nor even a dance dress.  I never wore one because I never went to a dance or a prom.  I did not go because I was not asked.  Without a date, a young woman of my generation didn’t have the opportunity to attend her own school prom.  She did not dare walk alone through the door onto the dance floor.

All of the women involved in the conversation that day were fifteen to twenty years younger than I. I knew that they could not possibly understand. Contemporary young women would likely reel in disbelief and shock if faced with the restrictions we girls lived under in the late 1960s and early 70s. If one did not have a date for a dance or a prom, one simply didn’t get to attend. I seriously doubt that a single girl would have been sold a ticket for her own prom—or that, if she had somehow wrangled a ticket, she would have been allowed to walk in alone. We, the overflow of plain young women without boyfriends or dates, simply bowed to the reality of the situation: we would not be asked, we would not attend. If we chafed under the restrictions, we were told that there was absolutely no point in railing against the situation. It was just the way things were.

But somehow, at some point, it stopped being the way things were. The daughters of “women’s libbers” and “hippies”, imbued with a sense of combativeness and personal worth that had been sadly absent in earlier generations, struck out on their own and refused to be tied to some male just in order to gain admission to their own school dances. Happily single, they demanded tickets. They bought their own corsages, slipped on their lovely gowns, tucked their feet into brand-new dancing shoes, and off they went. Even when asked by boyfriends to be their prom dates, these brave young innovators sometimes refused to be coupled to one person and instead attended in groups of girlfriends, free to dance (or not) with whomever they pleased.

I not only admired those young women, but I was fiercely glad for them.

When my daughter and I went to a showing of the Disney movie Cinderella, I found myself biting my lip and blinking hard against tears when the title character is barred by her stepmother and sisters from attending the ball.  Later, as we left the theatre, I told my daughter, “That’s what it felt like, on the night of my senior prom.  That’s how I felt.”  Her own eyes sought mine in compassion and she squeezed my hand.

There were no fairy godmothers for the Cinderellas of my generation. And I had not the needed courage, perhaps, to change the sad state of my own affairs. But I have nothing but admiration for contemporary young women who neither need nor want fairy godmothers, nor pumpkin coaches, nor glass slippers—who reach out with no magic wands but that of their own self-assuredness and hard work to create the lives they want. And I hope every one of them dances, like the twelve dancing princesses of another fairy tale, long past midnight and until their shoes are worn through.

My Daughter Speaks on Motherhood

Asked by her workplace to write a piece about balancing motherhood and one’s working life, my daughter wrote this moving, funny essay.

On August 23, 2018, I became a mother for the first time, to a beautiful, adventurous baby girl. Getting her here was no easy feat!–but I’m sure most mothers can say this to some degree about their pregnancies and/or giving birth. My personal story, though, is that I had to be scheduled for a labor induction because my blood pressure was getting too high (which was understandable, considering that I was the size of the Goodyear Blimp in the middle of a burning hot Indiana August). So,  one evening my husband, mother, and I went to the hospital to prepare to bring our little “bun” into the world.

Twenty-four hours of labor later, nothing had happened except for several frightening moments as both my own and my baby’s blood pressure and heart rate bottomed out (and zero luck with getting any sleep!). My OB/GYN (whom my husband refers to as “Dr. Sexy” because, in all honesty, the man really could have been on Grey’s Anatomy!) discussed our options with us. Option 1: Keep waiting and see what happens; Option 2: Stop the induction, let me have a meal, and start the induction again tomorrow (they almost had me on the whole “have a meal” thing!); or, Option 3: Get this show on the road and have a C-section. We went with Option 3. Already on an epidural anesthetic, I was dosed with more and wheeled into the operating room.

My C-section experience was ultimately unremarkable except for being able to feel them cut into me just before they pulled out our little bundle of joy. I do not jest! It really felt like I was in the movie Saw or something! But they snapped a photo of this perfect child and held it in front of my face, knocked me out completely and sewed me up, and I woke up just a little bit later to my beautiful little mini-me.

Once we were home, I was lucky enough to have my wonderful husband home with us for a month as we got into our new routine as parents. But eventually, my man had to go back to work and it was just me and this tiny little human being. Things didn’t exactly go as planned (when do they ever?), but ultimately, we got through my eight weeks of maternity leave. However, I experienced a whirlwind of postpartum depression, with crying and anxiety spells every day. I informed Dr. Sexy of my problems, and was matched up with an amazing therapist whom I still see to this day, eight-plus months later.

When I was asked by my office to write a blog post for our website about what it’s like to be a new mother and to balance motherhood and work life, I hesitated for perhaps all of 30 seconds before I signed on. I decided to be entirely honest about my experiences.

New motherhood has been hard and intimidating because it brought to light all my own personal “stuff” that I need to work on, as well as a general “What the heck am I doing?!” feeling that I’m certain will never go away. But it’s also been such an amazing, fun, happy, “There’s not enough adjectives to describe it” experience!

When I had to return to work two weeks before Halloween, I planned our routine and our route to the babysitter’s as well as I could, got there early–and then cried in my car for 20 minutes. Each day I felt both excited to be back to helping adults (adults!) with their concerns, as well as sad. At times it was unbearably heart-wrenching to leave my baby girl behind. But I am incredibly fortunate, for I leave her with family each working day, where she is cuddled and loved every moment.

So, to answer the question of how to balance work and family life: I don’t really have a profound answer to give you. All I can tell you is that it gets a little easier each day, even if certain hours are incredibly hard. And that it is so important to practice good self-care. As I said, I still meet with my therapist weekly to work on personal stuff that I want to have a handle on as I help shape my little girl into the woman she will become. I also laugh with my family and friends, have date nights with my husband, and try to acknowledge that it’s okay that I was, and am still, a complex human woman who wants to be present for everything in my life.

So, take it one day, one hour, one moment at a time. Talk to the people who love you. Ask for help. All of us in the “Mommy Club” are here cheering you on! Happy Mother’s Day, everyone!

How Ego Became a Dirty Word

Kept in check, regularly examined through conscience, and recognized as a personal identity having nothing to do with one’s possessions or achievements, the ego is a marvelous thing…

When did “ego” become a dirty word?

To the best of my understanding, in its original concept, ego meant simply that part of human consciousness which indicates “I”. It was understood to be the ability to distinguish one’s self from others; the awareness that comprehends personal experience. Over time, that original concept enlarged to include egotism—that is, conceit, vanity, or an inflated sense of self-importance. But, at its inception, the idea of the ego was simply that of self-awareness, and of personal identity–an ability which small children begin to develop at about age two.

Yet, somehow that harmless perception of a consciousness which distinguishes the self from others has mutated into an appalling concept: shameful at best, destructive at worst.

While I cannot lay all the blame for this divisive idea on a slew of philosophical books of recent vintage, I do believe they are responsible for perpetuating the concept that there is something inherently reprehensible about the normal human ego. Frankly, that makes little sense to me. Without a sense of separateness, of individuality, we cannot function in the world.

A healthy ego protects. It tells me unequivocally that, no matter what some nitwit says of me, I can make the decision to not believe their words. A well-regulated ego says to one, “Just because I am requested, ordered, to do this by a superior, I need not necessarily do it. I am an individual. I can make my own decisions regarding the rightness or wrongness of the order.” A wholesome, balanced ego is a shield against poor decisions and immorality.

Nor, despite the best arguments propounded by those who despise the term, is a normal ego an obstruction to empathy. To the contrary, knowing that I have endured a difficult, painful or troubling experience allows me to look with compassion on others who are undergoing something similar. The “I” that identifies as a separate entity recognizes and therefore empathizes with all the other “I” individuals who are enduring anguish.

Sadly, the concept of a healthy, balanced ego has somehow become almost inextricably confused with egotism. But the two are not the same. “Nothing in excess”, the Greeks are reputed to have carved on the temple of Apollo at Delphi, and the advice is as apt now as it was those thousands of years ago. An overweening or inflated ego is an excess. It is narcissism, selfishness, and self-absorption. It is the bane and antithesis of sympathy and concern. It does, as those self-same philosophical books decry, express itself in the attachment to things; it warps the personal identity into a mere exponent of possessions or achievements.

Those who lambast the idea of a personal ego seem to maintain the position that our very separateness also separates—separates us from each other, and from the divine within both ourselves and others. Again, that concept makes little sense to me. If I am I, then I am the Divine expressing as this wondrous, personal, individual being: myself. I am a perfect creation from the hand and mind of the Creator. To be in any way separate from my divine self is simply not possible; to think so is total hubris.

And if I recognize that divine and spiritual center within myself, then I must recognize it in all others, who are all also perfect creations of a perfect Creator.

I believe we came, were sent, into this world to experience life as individuals. Considering this, I recall that, in some versions of the myth of Hercules, Zeus desired to know what it was to live as a mortal. And so he fathered a son, Hercules, who would be both god and human. As the creator, Zeus was inextricably interwoven with everything; he was all he had created. But, through his son, he could comprehend what it was to be separate and apart from all he had created; to live as an individual; to be mortal.

Kept in check, regularly examined through conscience, and recognized as a personal identity having nothing to do with one’s possessions or achievements, the ego is a marvelous thing, leading us through a lifetime of personal awareness in conjunction with our spiritual core. Far from being undesirable, it is yet another impeccable creation bequeathed us by our Creator.

I Actually LIKE Iceberg Lettuce!

I realize this proves that I have absolutely no palate… 

Shameful as it is to confess in a world of gourmet food and connoisseurs of all the best taste has to offer: I really, really like iceberg lettuce.

I realize this is an extremely unpopular point of view. It makes me appear unsophisticated, unrefined, crude. It proves that I have absolutely no palate (well, just the sort of wines I prefer prove that, in any case.) But there you have it. I like iceberg lettuce. I prefer it to many other types of greens.

This isn’t to say that I don’t enjoy other forms of salad greens. I love spinach leaves, so deeply green and silky. Radicchio, Bibb, butter, romaine – toss ‘em on in, although I draw the line at bitter endive. But my favorite salad will always be based on a bed of iceberg lettuce.

I grew up in the traditional Midwestern fashion of many decades past, eating iceberg lettuce in all my salads and on my sandwiches; actually not even knowing, until I was probably about age 13, that there were any other types of lettuce leaves. And unlike a lot of other foods served me as a child (I still shudder at the sight of a Brussels sprout), I enjoyed iceberg lettuce. I still do. To my tongue, iceberg lettuce is the perfect crisp. Unlike radicchio or romaine or (again, shudder) endive, it has no bitter aftertaste. It crunches. It tastes green.

Mix iceberg in with shreds of red cabbage, small red radish roses, slivers of carrot, a bit of thinly-sliced celery, perhaps even some water chestnuts, and the aforementioned emerald green of spinach leaves, and yes, a few other types of lettuce just for the variety they add, and that, to my taste buds, is a perfect salad. I happily toss in sun-dried tomatoes, olives green and black and kalamata, raw broccoli and cauliflower flowerets, crumbles of feta cheese and croutons, while the dressing can be almost anything: my favorite Greek or balsamic vinaigrette, Caesar or even the delicious bleu cheese which will, unfortunately, break me out in hives and result in a quick trip to the medicine cabinet for a dose of Benadryl… But all that matters less than the welcome crunch and intense greeny-ness of my beloved iceberg lettuce.

Food sophisticates may well cringe and pronounce me to be a complete rube. It doesn’t matter. I will always prefer my childhood favorite: iceberg lettuce.

My Be-Attitude

When I am doing housework, I usually wear my glasses, not my contacts. This is a self-defense measure: I’m a lot less likely to end up with stirred-up dust or other particles irritating my eyes if I’m wearing eyeglasses.

However, due to those very eyeglasses, for a number of years I found myself regularly fussing—essentially, throwing a mini-tantrum—each time I opened the dishwasher. This despite the fact that I rarely run the dishwasher more than once weekly, since, living alone, it takes me days to fill it. But it’s also my habit to open the dishwasher the very minute it stops running, in order to check that none of the dishes (especially the small bowls I used for serving canned cat food to my pets, or the concave bottoms of some of my cups) have been positioned so that they are holding water.  I know from sad experience that the drying cycle won’t remove water from a pet food bowl that’s flipped upright during the washing.

Unfortunately, opening the dishwasher at this point sends clouds of steam rising. And that, inevitably, means that my eyeglasses completely fog up, making vision impossible.  I couldn’t see a water-filled bowl unless it jumped up and slapped me in the face.

And so, for perhaps three years, I struggled to remember to pull my glasses off my face before I opened that dishwasher door. Struggled, and inevitably forgot, resulting in a stream of (Bad Word Deleted) language, followed by roughly yanking the glasses from my eyes and scrabbling for a tissue to wipe them.

As I say, this unfortunate behavior continued for almost three years, before one day it occurred to me that, after encountering the rising steam and being thoroughly wiped, my eyeglasses were much cleaner–the lenses, of course, but I also wiped the hot steam from the frames and earpieces, cleansing them, as well. And with this realization was coupled the sudden understanding that my repeated irritation was totally unnecessary.  In fact, it was contrary to good sense.

The following week when I opened the steaming dishwasher, I was prepared. I took off my eyeglasses and carefully held them into the rising steam, making sure that it coated and heated every part of the frame and lenses.  Then I carefully and slowly polished them stem to stern before placing the glasses back on my face.  By that time, the dishwasher had stopped emitting steam, and I could see and empty any dishes which were holding water before closing the door and allowing the drying cycle to run.

Instead of a rumpled spirit, I had sparkling clean eyeglasses. Instead of fussing and irritation, I was relaxed.

And all it took was a change of attitude and perspective.

It’s strange, sometimes, the small and mundane ways that major lessons arrive in this life. Something as simple as opening a hot dishwasher door can inform us of just how often we view things askew, making our lives much more difficult and uncomfortable than they need to be.

I sometimes now stop, when I am irritated beyond measure by some minor event, and attempt to apply the lesson I learned from my steamed-up eyeglasses and the dishwasher door. And instead of steaming up within my spirit, I often find a way through to peace and courtesy and calm.

It might not be on par with sitting on that hillside listening to a master teacher speak the beatitudes, but I’ll take my lessons where I can find them. I am teachable; I can learn to be the master of my own attitude.


A Plague of Kittens

I just read another of those articles explaining that an unspayed feline can produce (blah-blah-blah) kittens in (blah-blah) generations, and I had to laugh.

Years ago, when TNR programs were non-existent, I casually fed a colony of feral cats on my doorstep, giving them kitchen scraps and the food left over from my indoor pets. And, yes, they produced massive numbers of kittens. But here is the salient fact: those kittens did not live. Over all the years—about a decade—that I (and countless mice, moles and birds) provided nourishment for those stray animals, only one of them, the colony matriarch, survived long-term. A lovely little calico who resisted being brought indoors, despite my best efforts to provide her a home, CallieCoCo produced an endless stream of both her own kittens and daughters who provided more youngsters for the clowder. Usually born at the start of each spring, and despite having a steady source of food outside of their own hunting, each year no more than one or two of the youngsters survived to the autumn. They fell victim to cars, to hawks and owls, to illness, and (horribly) to the neighborhood’s future serial killers practicing their skills. And those who survived the summer usually perished in the winter.

Finally, when the venerable matriarch herself passed, the clowder died off within a few months. Without her guidance, the colony could not survive. To the dismay of the local homeowners, the moles and mice returned to the area in droves. But the predicted plague of kittens never happened.

I had much the same Spockian “Where’s the logic?” reaction another time, in the 1980s when AIDS was at its height worldwide, as I read about the poorest regions of India. On two different pages of our local Sunday paper, two separate articles had been printed: one discussing the high birth rate of the poor throughout India (at the time, a totally destitute country, with years yet to come before technology brought pockets of prosperity) and the shocking implications for overpopulation; the other just as earnestly delineating the horrific ravages of AIDS upon the area, and the resulting gruesomely high death rate among the neediest population. Now look, I thought, switching back and forth between the pages of the newspaper, comparing the two articles, you can’t have it both ways. Either the poor will reproduce without limit, until the population is stacked up like cordwood—or they will die off in uncountable numbers as a result of AIDS. Each a dreadful and agonizing possibility, I thought, but one or the other; you can’t have it both ways.

Or, more likely, the high death rate from the plague of AIDS among the poor would counter the exceptionally high birth rate, balancing the two—because that is the way that Nature has, cruelly but effectively, kept things in check for uncounted millennia. High population—enter bubonic plague. High population—enter typhus and typhoid, war, natural disaster, famine, pneumonic plague, anthrax, ebola.

I sometimes try to apply this logical thought process to the science of global warming. Don’t misunderstand me: I absolutely do believe that the human race has added disproportionately and frighteningly to the earth’s overall temperature, and, if unchecked, will continue to do so, with unspeakable results. But I also know that we have measured the earth’s temperature for only a few hundred years out of countless millennia, and that there have been cycles of warmth and coolness throughout; e.g., the mini-Ice Age of medieval times. I know that it is these cycles to which those who resist a belief in global warming refer. Then, however—logically—I remind myself that all of these previous cycles were the result of natural phenomena, such as volcanic eruptions and comet strikes, or perhaps even dinosaur farts, and that those cycles did totally destroy existing fauna and flora, completely revamping the face of the Earth. I wonder then—logically—how much we really know about the earth’s temperature cycles, and the damage we are doing, have already done, to the ecosystem…if it can even be corrected, or if we have pushed matters so far that we now must let the chips fall where they may and, like the dinosaurs, watch as the very environment that once nurtured our evolution perishes, and us along with it. And, in terror, I very much fear that this latter scenario is true.

I think back to the vision of myself, watching playful kittens who never quite managed to survive, let alone overrun the neighborhood—switching back and forth between the pages of a newspaper with two contradictory articles—sitting through school lessons, learning about both the sweltering heat of Mesozoic mornings and the vast fields of ice that once lay across the Great Plains…and I wonder, really wonder, how much we, sure and certain in our superiority and our reliance upon self-proclaimed “leaders” who really need to pull their heads out of their behinds—well, I wonder how much we actually know of all that we think we know.

Homicide Is Not Pretty…or Hot!

My usual choice of escapist literature is the “cozy” mystery genre. These lightweight novels are relaxing, predictable, sometimes hilarious, often a tad silly, but rarely gory and usually lacking in nerve-wracking chills. “Thriller” is not, to me, a leisure pastime; I like to be able to turn out the light comfortably after reading in bed at night! But I enjoy these frivolous mysteries, which are interspersed with quirky characters and abound with loveable pets, and in which, as a usual plot line, only the characters one really doesn’t like bite the dust.

However, I may be reaching the end of my tether with my favorite genre. Since I review every book that I read—and that is a LOT of books—I found myself the other day beginning a review with the telling sentence, “I had second thoughts even as I downloaded this book: Did I really want to read yet one more ‘bakery’ mystery?!”

I blame the Sex and the City cupcake craze for the plethora of bake shop mysteries. The bakery mysteries have multiplied like Star Trek tribbles, and a great many of them are pretty pallid, with plots so similar they might have been created by algorithms rather than writers. Almost inevitably, the grand opening of the latest bakery will be blighted by the death of a first customer, with the baker/owner herself the main suspect. Of course, she will have to begin sleuthing out the real murderer, finding clues to which the police (who often seem to be drawn from a Laurel and Hardy movie) are oblivious. Meanwhile, our plucky heroine is never, ever arrested for interference in a police investigation—a fate which she richly deserves.

Now, to my way of thinking, Kerry Greenwood’s most excellent Corinna Chapman bakery mysteries (well pre-dating the slew of copycats which followed) sewed up the genre front, back and center. Beautifully written, excellently plotted, with three-dimensional characters and incredible detail, they are simply a delight to read. But those are not the only reasons for which I prefer them. I like Ms. Greenwood’s books best because nowhere, nowhere at all in their pages, does any character appear who might be even faintly considered a “hot hunky homicide detective”.  Yes, she does include an attractive PI–but never a hot homicide cop.  In fact, some of her police force characters are (gasp!) female.

But to judge by most of the other cozies (which I still enjoy, despite their flaws), every homicide detective in every rinky-dink precinct in every city of every state within the entire nation (every nation, worldwide!), is so attractive, chiseled, gorgeous, hunky and incredibly hot as to put most A-list Hollywood actors to shame. There is not a dud in the bunch. Nowhere in these many pages do we find a homicide detective (other than as a partner to the REAL detective) who sports a donut paunch and a balding pate; nor, heaven forfend, a female homicide detective, except as junior (very junior) partner to the hot honcho. Nope. If the cozy mysteries are to be believed, every desirable man on the face of the planet has chosen “homicide detective” as his career path. And he will, of course, fall like a rock down a cliff for the leading lady.

For this ridiculous notion, I must, sadly, hold the marvelous Janet Evanovich responsible.  Make no mistake: I absolutely adore Ms. Evanovich’s formulaic novels. I’ve read every one of them with utter delight—most of them several times each. They are the greatest escapism novels ever written. They are laugh-out-loud funny. They are just plain great fun, even for male readers.

But I cannot deny that the onus for the creation of the “hunky homicide detective” mythos likely rests upon Ms. Evanovich’s shoulders. I sigh over this, even as I acknowledge that it isn’t her fault that every aspiring and seasoned mystery writer took her idea and ran with it right out the door and across the meadow to the romantic sunset beach. Still, I blanch at the thought of reading yet one more lighthearted mystery featuring the same, tired old “hot homicide detective” plot device.

I will almost certainly go on reading my favorite cozy mysteries. Despite their many failings, I find the books both relaxing and entertaining. But I wish—oh, how I do wish!—that their authors would learn to show a tad more creativity and diversity when creating their leading men.

They Have Ruined Oatmeal For Me!

The Ubiquitous They have ruined oatmeal for me.

You know the ones I mean: the “They” who inform our daily lives, inducing fear, spreading urban myths, dispersing vague and often erroneous information. I have always envisioned them as something resembling the giant ants in the old sci-fi movie, Them. “They say that….” Who? Who says that? Rarely is the “they” saying these things ever defined. But we all know They. We all repeat their information – or misinformation.

And They have ruined oatmeal for me.

Make no mistake: I love oatmeal. I have since childhood. It was rarely on our table, since in the 1960s most mothers preferred to hand out boxed cereal and milk to their children rather than to cook breakfast. But as a grown woman with my own apartment, I indulged my love of oatmeal – indeed, of all hot cereals. Cream of Wheat. Coco Wheats. Rolled Whole Wheat. And oatmeal. Real oatmeal – not that wimpy instant stuff. Old fashioned oats, which took longer to cook and were rich with texture and flavor. Rarely did I add in anything except a handful of raisins and some raw sugar. And even on those days when I caved to a time crunch, I could satisfy my longing with a delicious but now-discontinued cold cereal called Post Oat Flakes.

I was never quite able to convince myself that Cream of Wheat, laden with a big pat of butter and sugar, was really good for me, but I enjoyed it nonetheless; ditto, Coco Wheats. I still remember fondly a winter morning before school when I had spent the night at a friend’s home; her mother believed in a hot breakfast on winter mornings, and so I sat down to a bowl of piping hot Cream of Wheat with a pat of butter still warmly melting on its surface. Rolled whole wheat cereal, harder to find but prepared laden with honey, delighted me, and I could at least tell myself it was a whole grain. But oatmeal – oatmeal was GOOD for me, and I loved it. As I grew to adulthood, I rarely had time for it except on weekends, until the addition of microwaves as standard office equipment meant that I could have my oatmeal for breakfast constantly. A recipe for Scottish oatcake was so delicious that I swore to indulge in it only a few times a year. Exercising restraint, I permitted myself to bake oatmeal raisin cookies only at the holidays. I rejoiced when oats were declared “heart healthy”.

Then my little world of hot cereal began to collapse like a deflating balloon. The word that poisoned my world was glyphosate.

Glyphosate, the broad-spectrum herbicide used on genetically modified crops. Glyphosate, determined by multiple jury trials to be responsible for causing cancer in those who used it regularly. Glyphosate, infesting soil, water, animals, and crops; occasionally mentioned by a few experts as a potential factor in the declining honeybee and butterfly populations, just as DDT had been for an earlier generation. Glyphosate, sprayed (They said) on harvested grain to dry it for storage. Glyphosate, ruining my Cream of Wheat, my Coco Wheats, my rolled whole wheat cereal. Glyphosate, infesting my healthy, hot, delicious oatmeal.

I continued to eat oatmeal even after first hearing about the glyphosate contamination of oats—even of organic oats, due to the spray drifting from treated fields over nearby organically grown and dried crops. The Ubiquitous They, I reasoned, might be wrong, after all. They might be repeating yet another urban legend.

But They weren’t. Lawsuits entered the courts, claiming glyphosate contamination in both hot and cold oat cereals, regular and organic. The company responsible for marketing the deadly weed killer was ordered to pay a staggering sum to a groundskeeper who had used the preparation regularly and contracted lymphoma.

Sadly, even after switching to an organic brand and praying it might not be contaminated, I find that I can no longer enjoy my healthy, delicious hot bowl of oatmeal. I can no longer bake and eat my favorite oatmeal raisin cookies, even at the holidays. I’ve entirely stopped baking my beloved Scottish oatcake.

I suppose it wasn’t really They who ruined oatmeal for me, but corporate lies and greed and misinformation, coupled with ecological apathy and insouciance.

But if I ever encounter giant, mutant ants, I’ll send them to the They who ruined oatmeal for me.

Spirituality is Big Business

When I was in my early twenties, I picked up a slim paperbound booklet that discussed a technique called Treasure Mapping. I think I paid about $1.50 for it. (I was not very affluent in those days, so I certainly couldn’t have paid much more.) The technique illustrated in the booklet would today be understood as making a vision board, and I found it fascinating. “Pictured Prayer”, the booklet explained, was simple and produced excellent results.

I gathered together the necessary accessories, all of them easy to obtain and inexpensive: photos clipped from magazines, glue, pens, construction paper — and created my first vision board. I’ve used the technique many times in the intervening years, often with surprising success. I have sometimes come across my old, discarded vision boards and realized with satisfaction that nearly everything I pictured on them had come to pass.

But recently I saw an announcement for a class in vision board making. The cost for the two-hour course, which included all materials, was $150.00.  I thought back to my $1.50 booklet, and the years of photos clipped from magazines or downloaded on the computer, the poster boards, glue sticks, glitter, stickers, or occasional scrapbooking supplies – and realized that I probably hadn’t spent $150 on all my Treasure Maps in the 40 intervening years.

In that distant era, even as I learned vision boarding, I learned to meditate by selecting library books to read about meditation techniques, listening to tapes borrowed from the library, and asking advice of those who meditated regularly. After hours of dedicated practice I found the method that seemed best for me and made meditation a lifelong practice. Today, though, I could choose to spend anywhere from $10 weekly for an hour’s guided meditation at a local new age shop to $60 for an on-line course complete with an instruction manual, interactive forums, and (this one still puzzles me) a certificate of completion. I could purportedly learn the hands-on energy healing system of Reiki entirely on-line, without ever setting foot in a master teacher’s office. I could pay $10,000 for a spiritual retreat with a self-professed guru. I could complete an on-line course to become a “spiritual master” in any one of a half-dozen different disciplines – and, having completed the course, be surprised with the information that there is yet another, higher level available that could not be revealed to a mere novice, but only to a seasoned acolyte. And, of course, that newly-revealed level could be mine for only an additional $59.95!

Americans, it seems, do not believe that anything, even spirituality, has value unless it is paid for – by cash, check or charge, rather than blood, sweat and tears. You need not put real effort into learning as long as you are willing to sit at the feet of a “master” and fork over money – and plenty of it.

This isn’t to say that there aren’t legitimate costs connected with teaching classes or holding retreats! Retreat attendees have to be fed and housed, and the teacher’s time has to be compensated. A class venue doubtless has costs attached – rent to be paid, utilities to be provided, class materials to be printed. But the hubris of charging $150 for an hour spent “instructing” students how to paste pictures on poster board, or to chant, hold crystals, or meditate, veers (at least to my way of thinking) about 180 degrees north of genuine spirituality.

Once the province of the moguls of big religion, spirituality, too, has become big business, and a lucrative business, at that. Native American spirituality is taught by those who have not one iota of genetic material from the original inhabitants of North America, and their students pay the sun, moon and stars for the privilege. Instructors with no passion except that for feathering their nests promise to incite a passion for life in their unwitting students, and coin money as they do so.

My personal advice to anyone seeking a spiritual teacher is simply this: remember, first, that you are your own best teacher. There has never been a better or easier time for self-learning. Explore cautiously, keeping both an open mind and a weather eye, but explore. Read, watch videos, learn, practice. And if you find you need assistance to progress on your chosen path, or feel ready for that retreat, or believe a class with others might help – do your homework. Seek out a teacher who is genuinely a master of her or his discipline, who is passionate about passing knowledge on to others, and who, most importantly, lives in such a manner as to demonstrate the value of the subject in which they will instruct you. If a cost is associated with the instruction, investigate what the payment covers, and decide if it seems reasonable – reimbursing the instructor’s costs, time and other essentials, or keeping a center in the black, but not intentionally generating massive profit. And only then decide if the price is genuinely worth paying, or if you can find methods less financial and more truly spiritual to gain instruction in your chosen discipline.

But the finest spiritual instructor will always be the one you find in two places: your own mind, and your own heart.

And if all else fails, you can always make a Treasure Map.