The thin ice of censorship

Jeremy Littau

Journalist, professor, and internet culture expert Jeremy Littau offered a Twitter rant last week in the wake of the New Zealand shooter posting a live video on Facebook. I've put together his posts so that we can study his argument in its entirety.

So this is probably going to get me in a lot of trouble with the free speech wing of my corner of academia, but …. it’s time we have a serious conversation about what social media regulation looks like. The past two years I find myself favoring it more than I used to.

Early in my academic work I was drawn to the power of self‐publishing. In many ways I still am. It’s important to remember that giving people a platform with a microphone gives ordinary people a type of power they don’t have in a world of big media. I still believe in that.

But those were ideals at early stages. Scale reveals flaws and problems of design. And some of these mass shootings the past few years have revealed some serious holes in the free‐speech argument.

To recap, we have a live video here, spread by social networks, and a manifesto that is eminently linkable. The flow from self-publishing to attention to news brings attention to the source material. The goal for the terrorist here is publicity, and he's getting it.

News gonna news. Its job is to cover things. But the system of news production we have is based on a pre‐internet world where there are no links to the source material. You got everything through the journalist gatekeeper, filtered in a world without hyperlinks.

But in a social‐internet world, news is publicity. It’s an invitation to google or search social media for the video or manifesto. Not everyone will do it, but many will. @zephoria has talked about this extensively, that it is an unwitting engine for recruitment of radicals.

Unleashing social live video is akin to experimenting on ourselves. Like many things that were done by the internet’s early builders, we did it because we could. And so we did. Clearly there are a lot of good things being done with live video, but at what cost?

I don’t even know how to weigh this, so don’t ask me how. I just know my kids are growing up in a world where they’re going to hear about and be able to search (or be unwilling viewers of) live shooting videos because our tech simply can’t keep up with extremists.

This is too fast for any system, human or otherwise. There's only one choke point we can reasonably put in the system, and that's to deplatform extremists: not after they spew their filth, but before it. I can hear you "what about"ing from here.

The example I keep hearing is whether we want Facebook to decide. I think that’s the wrong framing. We are currently letting Facebook’s audience decide and they’re the ones spreading it. The enemy isn’t Facebook; it’s us.

If your car has faulty brakes, you recall the car, fix it, and put it back out on the road. Live video is a broken system and enables some of the worst abuses on social media. Put this thing back in the garage until we can design a system that doesn't tear at our social fabric.

I am not an expert on the tech/code side of this. My sense, though, is we’re using things like algorithms and machine learning to deal with abuse post hoc. Social science can help here. We know abusers have certain patterns. We can stop the horse before it leaves the barn.

I can hear your objections: it sounds Orwellian. But we are not designing solutions to meet a serious threat. We can't have a social media shooter making their sick version of a live documentary every month and expect it not to damage society in profound ways.

MASS SHOOTINGS HAVE A WAY of making the theoretical talk about Facebook, Twitter, and YouTube and their role in spreading hate all too real.

Mr. Littau makes several good points here, and when Jeremy speaks, we of his tribe tend to listen. I think his fear is well‐stated, and I agree we need to have a discussion about this. BUT. The post‐modern era is decidedly horizontal, and censorship of any form requires a hierarchy administering what’s acceptable and what’s not. Frankly, that scares me a whole lot more than innocent eyes seeing man’s inhumanity to man.

Today's news is the newsgathering process made public, and as those of us who've been around for a while know, the process can be very ugly and messy at times. We've decided as a people that this freedom to make up our own minds in that process is the pinnacle of the freedom offered by the First Amendment, because WE ARE THE MEDIA today.

It took us a long time to get here, and we should not give it up easily.

When Journalists Accept Confusion

As regular readers here know, I have Palestinian in-laws and grandchildren, for my oldest daughter is Muslim (of which I highly approve) and is married to a man who was born in Palestine but was forced to relocate to Jordan in the wake of the six-day "war" in his homeland. This has forced me to do my own study of the history of the conflict in the region, because my window to the world is likely quite different from yours. I've nothing to "sell" in this regard; I'm simply being the journalist I was trained to be and practiced as for 45-plus years in the industry.

Alison Weir

In viewing YouTube videos from California activist Alison Weir (Executive Director of If Americans Knew), I've found a kindred spirit who I wasn't aware existed until now, thanks to my son-in-law. Her explanation of the ignorance she once knew is very similar to my own. The timeframe for this quote is the mid-2000s:

“Five years ago, I guess it was, I knew almost nothing about Israel and Palestine. I skimmed the headlines on the topic. I accepted the confusion of what I read, and like most people, I just moved on. It seemed distant and really irrelevant to my daily life.”

After seeing images of children throwing rocks against Israeli tanks during the second intifada, Weir began to take it seriously and wonder what was really going on. Her research as a journalist lifted the veil of ignorance and opened her eyes to the truth: that American media provides only a HIGHLY propagandized — and therefore one-sided — version of reality in the Middle East.

I’ve had the same revelation, and I’ve come to believe that this is available to anyone who searches for it. It begins with this statement by Ms. Weir:

“I accepted the confusion of what I read.”

This is a remarkable admission for any journalist. What is it about confusion that favors our just giving up on it? Accepting confusion is a terrible habit, especially if that confusion is fed by somebody's lies, but if I'm to be truthful, I must admit to the same acceptance prior to 2006. That's when I visited my daughter's family in Amman, Jordan, where my confusion was multiplied by stories of oppression and violence by the Israelis that bordered on the unbelievable. No wonder I was confused. Among these seemingly preposterous claims was the story of armed Israeli settlers who roamed the roads in the West Bank in automobiles, shooting and killing Palestinians at will. I simply couldn't bring myself to accept what I was being told.

Confusion, it seems, is a balm given to those who look the other way in the face of evidence to the contrary. I’d rather be confused than accept that reality is really quite simple. I need it to be confusing, because I need to embrace Israel as the birthplace of my faith. Poor, innocent, lovable Israel.

After I returned stateside, I began investigating the particular claim I’d heard. I found that the NBC News Bureau in Israel was run by a former coworker of mine during my years in Milwaukee, so I called him one day. To my utter amazement, he confirmed completely the story I’d been told in Amman, that cars filled with armed Israeli “settlers” regularly drove around the West Bank killing Palestinians with impunity. How, I asked him, was it that I’d never heard of, much less seen, such a story? Why, I asked, didn’t he do stories on such things? “We do them all the time,” he responded, “but they get spiked in New York.”

So there it was, right in front of me, and I still had trouble believing such atrocities. I began to look deeper and seek out sources of information beyond the mainstream. My family was a great help, for the entirety of the Arab press wrote about such things. I found Mondoweiss, an online publication specializing in stories about the Palestinian crisis but told from the perspective of non-Israelis. It is quite an eye-opening experience to subscribe to the daily Mondoweiss newsletter. There's little attempt at balance here, but reading it helps me realize that there still is a remarkable "other side" to the story we are fed by Netanyahu, the Israelis, and the American press.

The confusion lifted, and my view became clearer and clearer the more I investigated via the web. One thing that had colored my view was my history working with Pat Robertson and The 700 Club. We owned a radio station in Lebanon and gave aid to the Marjayoun Hospital (about which the IRS was concerned). We were "with" the Israelis every step of the way, but not because we were in the least concerned about the conflict involving Palestinians. Rather, we were in for a pound, because we preached (as did other evangelicals) that 1948 was a fulfillment of Biblical prophecy regarding the return of Jesus Christ for His 1,000-year reign (depending on your view of the Rapture). Israel had to return to Jewish Nation status before this would happen, so we preached that the end was near. Moreover, His return has to be in Jerusalem, which is why Christians are so happy with Donald Trump for recognizing Jerusalem as Israel's capital.

Zionism, the political movement, and Judaism, the religion of the Jews, are not the same thing, no matter how the Netanyahu government presents it in discussions of antisemitism, the expressions of those who "hate" the Jews. Israel is not a theocracy, and its government is certainly of man. It's okay to criticize Zionism without being automatically labeled antisemitic, although Netanyahu wants the two connected for propaganda purposes.

The defense of Zionism begins with the Holocaust, and Israel’s right wing is quick to reference it and to do so with regularity. Zionists need the connection to maintain any semblance of moral high ground in denying Palestinians any rights whatsoever. Consider the IDF celebrity Elor Azarya, who served just nine months in prison for the extrajudicial execution (a.k.a. murder) of Palestinian teenager Abdel Fattah al‐Sharif. He was convicted of manslaughter, but the people of Israel refused to accept it. Here’s a part of what I wrote in December of 2017:

The people of Israel — not just the government, the people — want Azarya released, because they view him as a hero and his extrajudicial execution of a Palestinian teenager in the streets of Hebron last year as completely justified. I'm serious. Azarya was 19 years old when he blew the brains out of an incapacitated and bleeding Palestinian who was lying prone on his stomach in a pool of blood. Azarya pulled his rifle, walked a few steps to get close to his victim and shot him in the head. All of this was caught on videotape. This blatant murder was reduced to manslaughter with Azarya sentenced to 18 months in prison, four months of which was immediately suspended. The people of Israel want him released, and the latest news is that Israeli President Reuven Rivlin might just pardon him. Israeli Prime Minister Benjamin Netanyahu has called Azarya "everyone's son" in calling for his release. You should also know that there are questions about the belief that Azarya's victim was, in fact, the man who attacked an Israeli soldier with a knife on the day he was executed. The whole mess stinks, and yet Azarya's smiling face is plastered all over the country as a symbol of the fine young men who defend Israel and her government.

Forensics revealed that it was Azarya’s bullet that killed al‐Sharif, but it didn’t matter. This is a blatant example of Israeli treatment of Palestinians but by no means unique. Many of these murders have been captured on videotape, but no one in the West is moved whatsoever. It’s just too darned confusing.

Americans ARE confused by events, because everything we read is driven by the Israelis and their propaganda practice, hasbara. Although Zionism has been around since the 19th century, it was the 20th century and the German Holocaust that energized it in such a way as to bring about the modern nation of Israel. For Israel to be justified, it must continue to lean on the Holocaust in such a way as to present itself as a lamb surrounded by wolves.

It is hardly that. Israel has nukes. Israel has a powerful military with cutting-edge technology and weaponry that's the envy of the world. It also receives from the U.S. $10 million each and every day (weekends included) to sustain its edge in controlling its corner of the world.

And, so, the question that needs to be asked most is “what do we get out of this?” It’s a fair question and one that journalists shouldn’t be prohibited from asking. And, perhaps if that happened, we wouldn’t be nearly so confused as we are.

Big J Institutions Ignore the Digital Truth

January 2019 was another tough month for media companies struggling with ongoing revenue declines. Layoffs came in bunches as Gannett, Buzzfeed, Verizon, The Huffington Post, and others tried to balance the books against losses on the inbound side of the ledger. The problem, however, isn't those always-evil "market forces"; it is, and always has been, an inability to correctly read the declines and respond accordingly. To use a very old illustration, if the railroads had known they were in the transportation business, they would've owned the airlines. But, no; they assumed they were in the railroad business, which allowed the disruptors in. The same is true with media: they're not in the news business; they're in the advertising business.

The digital advertising market is far bigger than local media companies understand, and this remains the top obstacle in all efforts to "save" local media (and "media" in general). The most baffling element of this is how these companies refuse even to compete for all the dollars locally, choosing instead to compete only for dollars already spent on their models. As a result, local media companies reach only 15 percent of the total available local digital market. The saddest part of this indictment of media managers is that the market is growing while the traditional forms of advertising are shrinking, so you'd think these corporate managers would want a different business model. They don't.

Nobody knows this better than Gordon Borrell, the man who provides the measurements of how well or poorly these companies are doing. Borrell provides details sliced many ways, but perhaps the most revealing is his recent data on what he now calls the "addressable" digital market: the percentage of total digital advertising spending that comes from advertisers who believe the local media companies' sales pitches and spend money with them. This is the share of the market that media companies serve, and it has been shrinking for as long as I've known Gordon. What media companies don't seem to understand is that their model is inefficient, because it's based on the archaic marketing rules (reach/frequency) governing mass marketing. Meanwhile, digital pure play companies (those that exist to provide targeted ads to individual browsers based on those browsers' history) get 85 percent of the total market.

In a webinar last month, Borrell provided data about this “addressable” market, and while it remains a big number, it’s nowhere near the overall marketplace. Here’s a snapshot of that (provided to us by Borrell), and it shows the futility of chasing only those dollars spent with a mass marketing model.

Data provided by Borrell Associates

This graph reveals that all of these media companies today are competing for only 15 percent of the total market. In Texas, they call this "dumber than a bucket of hair," but Borrell is much more circumspect, calling this obvious failure a product of the environment in which local media companies operate. That's fine, but shallow industry thinking is never created by "the devil made me do it"; it's a question of knowledge, tools, and the intelligence on how to proceed.

Many years ago — before I went to work for AR&D in 2006 — I was invited to make a presentation to a media group in Tampa. Sweeping changes were just beginning to impact their business, and they wanted a summary of those changes, so they could figure out what to do. At the end of my session, I was asked a question that completely altered my focus on “the problem” they faced. “This is all great, Terry,” the top dog said, “but where’s the money?” I didn’t have a very good answer for them, so I spent the next 10 years studying the question. The conclusion I reached early on was that if these companies continued to proceed with only mass marketing as their model, they would soon fade from relevance altogether.

So, to me, the issue wasn't about content, because while media content was certainly being disrupted, the blow to their business model was the only one that really mattered. Nobody listened, in part because these companies are run mostly by older men, who seek first to move themselves and their families into a comfortable retirement. Rocking the boat isn't conducive to that end, and this is another part of Gordon's "environment" that contributes to foolish decisions at the top.

On another occasion, I was making a presentation to the top managers of an east coast media company. Among the strategies I recommended was to get into the local search business. The owner of the privately‐held company was present, and he asked me, “You really want me to compete against Google?” I said, “Of course. Google is competing against you.” The company tried a couple of things I recommended, but their need to move every innovation into their mass media business model proved me right.

I simply couldn't convince anybody that targeting individual browsers in the community was the Holy Grail of digital advertising and that abdicating this to the pure plays was corporate malfeasance. At its core, the problem begins with executives believing they're in the news business. They're not. They're in the advertising business, and that's where their focus should be.

And, here’s the most chilling aspect of this: local media companies are unable to see the impact the disruption to advertising is having on the local communities they serve. Here’s another image from Borrell’s addressable market presentation:

85% of digital advertising money that originates in the community goes to pure play internet companies. That money leaves the market forever. These companies pay no local taxes, employ no local people, contribute to nobody’s community chest, and are a net drain on the economic well‐being of the community. This money drain is staggering compared to what local media companies are getting, and it shows no sign of a reversal any time soon.

Finally, I had a telephone conversation once with a guy from an ad exchange about the possibility of partnering with local media companies. In what was an embarrassing reality, this sales executive told me, “We don’t need to partner with anybody, Terry, because we already have access to 100% of the browsers in any market anyway.” 

You can ask Borrell about all of this for yourselves at his annual Local Online Advertising Conference March 11–12 in New York.

If I owned a local business, I certainly would want my money to go where it’s the most efficient and effective for growth, and all the evidence loudly screams that targeting local browsers is the way to go. The sales pitch of the account exec representing my favorite TV station seems shallow and archaic in comparison. There are no secrets that media salespeople can manipulate to their advantage anymore, and maybe that’s the real problem.

Regardless, managers who wince as if I’m calling their baby “ugly” have only themselves to blame, because the things I preached back then have certainly all come to pass. It ain’t rocket science, folks, and here’s a final prophecy for consideration. Either local media companies band together to attack the problem at the local level, or there will only be room for one “winner” in each market in the not‐so‐distant future.

The Management Culture

Abraham Zaleznik, 1924–2011

My newest topic of study and writing (another book) is one that I've touched on many times in prior works: the idea that managers and leaders are completely different personality styles. In 1977, Harvard Business School psychologist Abraham Zaleznik published his brilliant essay, Managers and Leaders: Are They Different? This paper set forth a line that separates the two personalities, and it has been a seminal document in the education of Terry Heaton.

The reason this is so important is that the managers have had their way since the invention of movable type, which gave them the ability to sell their beliefs to wide audiences. Slowly but surely, the idea that you can manage your way to just about any goal (a management term) has led to disillusionment and frustration, because it's just not possible to continually manage without the creative innovations provided by leaders.

This is why the financial laws given to ancient Israel through Moses included checks and balances to prevent anyone from gaining massive wealth or from being forced into poverty. The Sabbath Year and the Year of Jubilee were designed to keep everything honest between people when it came to their financial well-being, even to the extent of recovering lands they might have lost in the years prior to the Jubilee. The Israelites may have practiced this in the beginning, but clearly they gave it all up centuries ago and turned to profit-based hegemonies.

What we have today in the West is a management culture, one that's built on hierarchies and rules, all of which serve the top of the pyramid and not the base. Oddly, the vast majority of the population agrees with this and even votes for those who make the rules that keep them forever at the bottom. Each institution of human existence offers a process-driven solution to a problem that's based, amazingly, on its own core competencies, but this is a public mask for private manipulation. Banking is the most obvious example. Banks hold our money "for" us, but that's just marketing doublespeak. Banks exist only to serve banking, and the clearest example of that is how those least able to give their money to a bank are punished the most for not playing by the rules. We accept that this is "the way it is," and the management culture advances.

When managers reach the inevitable wall that such formulaic adherence to rules must produce, those who pay the actual price are the rest of us. What rewards managers is growth, and growth has limits. Always. There will come a day when these rules force a stoppage of growth, but to managers, this is just another hindrance that needs correction through management of the bottom line. People identified as "expenses" are summarily dismissed in order to help the guy who managed the destruction get his bonus. There is zero incentive for such people not to step all over others in the name of meeting the money needs of the owners (who, by the way, managers have convinced us are the good guys).

The Shirky Principle is even more telling, for it states that "institutions will always try to preserve the problem for which they are the solution." Drug companies are a great illustration of this, because it's not always in their best interests for their medicines to provide cures. The pharmaceuticals industry is a terrific example of how the managers at the top get filthy rich in the name of "research" to help the world, but the sheer size of drug company CEOs' salaries makes such a position utterly false. As a result, the entire industry is riddled with shame while touting the good it does for the community.

The paradox of prosperity, it should be noted, is that discontent increases with opportunities for acting on it.

In the world of music, the management culture inserts itself in a couple of ways. One, managers determine who gets money and who doesn't, and it all depends on their ability to manufacture (a management term) hits. And, since managers are risk averse, this results in the homogenization of music that sells. Let me be the first to say that the purpose of music among humans is not to make money, but this is the fruit we have from the management culture.

Two, the method of teaching music has adapted to the management culture by eliminating the ear from the making of music. In bluegrass music, for example, the invention of tablature puts complex and fast notes in learnable form on paper. This has produced some phenomenal new 5-string banjo talent, but everything sounds the same. Pickers who stand out are guys like Jim Mills, whose right-hand work can't be completely transferred to paper. His ear-taught methods are unique; even though he can play the same songs note-for-note as the tab players, he sounds dramatically different, because his ear tells his right hand to "punch" certain notes and play others softly. This produces a loyalty to the song instead of the notes, and that's the nuance that's lost with tablature alone.

Consequently, originality in music has become a niche and not the main market, and this benefits only those willing to be “managed” to prosperity.

Happy with the music industry? Read Joel Rose’s recent NPR article, Why Is The Music Of 1968 So Enduring? ‘It Was Allowed To Be Art’.

“I realized that I was part of the rebellion, and not part of the establishment,” says (author John) Simon, who earned a degree in music from Princeton University before getting a staff job at Columbia Records. “Part of being the rebellion is, you could rebel musically in the studio. You didn’t have to be as formulaic as in the past.”

The management culture copies formulas for success in every walk of life, including, believe it or not, the church. Here we have an institution with little incentive to overcome cultural evil, for that would take it out of business. Instead, the message is always "you need us" in your life for protection against the culture and the possibility of going to hell. We are taught to believe this is "truth," so we behave as instructed, which helps the other managers stay on top and in charge. Ask yourself this: if churches aren't a part of the management culture, why is the goodness of churches heaped only on those that are growing?

The management culture put Donald Trump in the White House. It was inevitable and predictable.

Another institution fully involved in the management culture is medicine. Doctors today are troubled by patients who, educated by fellow patients via patient websites, question both diagnosis and treatment. The doctors don't have time to argue, because other patients are jammed into a queue that's part of a profit process built by other managers. The authority of the doctor is rightly challenged by the spread of formerly protected knowledge, and I always point to the story of Lorenzo's Oil and a statement by Lorenzo's father: "The needs of the doctor are different than the needs of the patient." Healthcare in the U.S. is an enormous mess, thanks to the fine work of the management culture.

I can't help but think this way after reading The Education of Henry Adams, in which Adams notes that "The way of nature is change (chaotic); the dream of man is order." Order is truly an unreachable dream, because human nature gets in the way. The only way to produce a form of it is to apply force. Self-restraint requires sacrifice, which is not a hallmark of the human condition, and absent an internal governor, order requires some form of bayonet at our backs. It's good for the culture, right? Maybe not so much.

The world desperately needs Zaleznik’s leaders, people who are comfortable with a little chaos in the mix. For them, problem‐solving isn’t always based in what worked before. They are fearless in that sense, and can’t be tied down to a specific set of rules to follow. They must have freedom in order to innovate, and the management culture has a serious problem with that. And since progress is judged by those who play by the rules, very few institutions are run by leaders.

And the most ridiculous idea that the management culture perpetuates is that one can follow certain systems or processes to "become" a leader. Zaleznik himself tried to meet that demand with a book of his own. You can call a manager a leader if you'd like, but that doesn't make her a leader in the Zaleznik style.

The problem is that absent contributions from both, the culture can’t really function as free, for there is a grave difference between the liberty of free people and the license demanded by those at the top of the tower. Can we overcome it? Perhaps, but given the nature and depth of the hierarchy, they won’t give up their positions without a fight, and that conflict could be very, very deadly.

I want to end this with a Bible verse that speaks to the core of this dichotomy, because it strikes at the motive of managing and being managed. It’s from the book of Ecclesiastes, chapter four, verse four (NIV):

And I saw that all toil and all achievement spring from one person’s envy of another. This too is meaningless, a chasing after the wind.

Don’t ever think that managers aren’t aware of this. They exploit it to their own ends, and we just go along.

After all, it’s a management culture.

Presenting the apostles as holy

I don't usually write of theological matters, because I'm not a theologian. Of course, my philosophy rejects such expertise in the first place, because as far as I'm concerned, theologians tend to embrace what they've been taught, and that doesn't include doubts. It's the same formula for all expertise based on education, and this is one of the great differences between modernity and the postmodern mind. To pomos, experience is elevated above book-learning, and this is upsetting to the status quo. I've written of this many times, so bear with me as we examine something important about Christianity.

Russian painter Simon Ushakov painted his "Last Supper" in 1685. Ushakov was an Orthodox Christian; Orthodoxy is the primary Christian practice in Eastern European countries and elsewhere. The painting is significant because it depicts Christ and the disciples with halos, which was the custom when trying to present these historical figures as holy. The use of such icons is common in Orthodox history, similar to the icons of Buddhism. Only Judas is depicted sans halo.

There are two real problems with this. One, in proclaiming these disciples as “holy,” the church bends the historical narrative for its own benefit, which was to create a dependency on the church for access to God or anyone holy. If we can be convinced that the only way to gain our own halo is to live a “Christian” life in accordance with church teachings, then the hierarchy of the church is not only intact, but it has a permanent place in the lives of the community.

Two — and this is perhaps the biggest difficulty presented with this “holy disciples” narrative — it simply isn’t accurate. These were not holy men; they were just like you and me. Their clumsiness as humans is lost in the recognition of their holiness, and this alters the meaning of important texts that would greatly help each of us on our own journeys through life. The growing of Christianity’s brand, therefore, was based on a fallacy, for in order to receive the good news, we must be convinced that we are unable to become what the disciples became. We need to be kept in our place to assure order, but that all changes, if the disciples were just ordinary people. In fact, to conclude their holiness even in the early church is to lose track of the gospel in a hodgepodge of mixed messages.

The just shall live by faith, unless and except when it comes to everyday living. Then, it’s all about our behavior, and grace gets kicked aside in the need to maintain the institutions we’ve built. That is best accomplished if those 12 men were kept aside as ideals for us to chase. But that was never really the case when Jesus was teaching them, assuming one’s belief in the Bible as a teaching document.

In Luke, chapter 17, for example, the disciples asked Jesus, “Lord, increase our faith.” If these were holy men, they would have no reason to make this request. Their motive would have been to help Jesus in spreading the word through performing the same signs and wonders that He did. But, they were not sanctified and separate from others in their thoughts or their behavior. Therefore, Jesus detects the self‐centered motive in asking for more faith. They’re profoundly impressed with the guy and want to share in His power. They’ve heard him talk about this “faith” thing and don’t understand. So their question really is, “Lord, increase our faith so that we can do the same cool things that you do (and gain an advantage over our fellow man).”

The parable of the mustard seed follows, in which Jesus describes the faith of a mustard seed for them (it simply does what it’s supposed to do) and challenges them with a statement that, if they were to act similarly, they could move mountains or toss trees into the ocean.

Then, however, he shifts to another parable — the unprofitable servant. He essentially tells them not to expect any personal reward for doing what we’re supposed to do — like uprooting trees — because we are nobodies compared with the Creator. So, Jesus saw through their request, and this story has become fully bastardized through this idea that these men were holy simply because they followed Jesus in the beginning of His ministry. Many translations, for example, reference the size of the mustard seed as depicting just a wee bit of faith, which is a bit like being a wee bit pregnant. In for a penny; in for a pound. The decision to think and act for the benefit of others is contrary to our nature, and Jesus certainly knew that. He was fully man and fully God, according to the book, but isn’t it amazing how rarely we speak of his humanity and the internal conflicts He must have known? We need to think about it, however, because the book — especially the New Testament — takes on new meaning and significance if we can understand that the only holy man at the time of the disciples was Jesus himself.

The story of Jesus being tested in the wilderness is a great example of the humanity of Jesus. You know the story: Jesus was dealing with the torture and death that awaited him in Jerusalem. He was suffering internal agony over the conflict. If we examine only His godly nature, we think that he was actually confronted by a devilish character who tempted him to avoid crucifixion by bowing down to the devil’s wishes. This makes for chilling — almost Tolkienesque — imagery, but it’s much more likely that these ideas — the stone, the leap, the rebellion — were birthed in His own head by his own ego, for that is the realm of man wherein temptation resides. This makes the story much more relatable to all of us, for who hasn’t heard the voice of his own ego?

It’s too risky. I might get hurt.
It’s really not my ministry.
Surely, I can take just one drink, right?
Nobody will see me in the adult video store.
I’m not worthy of getting a raise.
I’m special and good.
I’m special and bad.
My intentions are good.
It’s too hot/cold to keep my commitment.
He’ll never miss that $20 I owe.

So Jesus is hungry. Bang, he entertains the thought of changing a rock into a loaf of bread. The lesson doesn’t change whatsoever, but the scenario becomes much more relatable, because we’re no longer forcing ourselves into a magical illusion in the name of holiness.

The Protestant rebellion against Rome was based on the belief that “the just shall live by faith” and that grace, not works, was the path to righteousness in God’s eyes, thanks to the sacrifice of Jesus. The reasoning was simple. The law of Moses could not be kept, so God, through His unmerited favor, intervened with one final sacrifice for all of humanity. The problem for Rome, of course, was that such a stance gutted the hierarchy in place within the church. Controlling the behavior of citizens on behalf of the haves of the culture became its “business” model. Its value proposition was that it stood as God’s intermediary with His people, and God only worked through its priests, cardinals, and bishops. The church sold its blessings to the highest bidders and became the absolute governor of human behavior by promising the “right” pathway to heaven.

And if people can be persuaded to accept their lot in life as God’s plan — and that their reward for this comes in the afterlife — then the masses can be controlled on behalf of the rulers of the culture. This is what we cherished for hundreds of years.

Protestantism stood a chance at replacing the power of Rome, but corruption reared its ugly head in the name of Protestant evangelism. Destroying entire civilizations in the name of God became morally acceptable, as long as it meant growing Christianity worldwide. In establishing the faith as a dynasty, it was important for the church to picture “holy” humans with halos to justify its unique separation from the filth of humanity.

And, as long as we covet our own halos, we’re susceptible to manipulation by the alleged grantor of halos, namely the church.

Jumping the Shark: Criminal Minds

Let’s review. I’m an old guy. I mostly watch crime dramas on TV, which is typical for my age group. I’ve watched them all and have seen my share of programs come and go, but Criminal Minds has been one of my favorites for a very long time. It’s sad to see articles like this one from a Zimbio list of TV shows that are likely to get cancelled next year:

While the first 13 seasons of Criminal Minds received an average of 23 episodes per season, Season 14 garnered a mere 15‐episode order. The long‐running CBS drama will soon reach its 300th episode. All of that means nothing if ratings are down. Season 14 debuted to 4.45 million viewers, easily making it the lowest watched episode of the entire series. After suffering the loss of major characters like Aaron Hotchner and Derek Morgan, perhaps it’s time to write the final chapter for Criminal Minds.

I’ve been wanting to write about this for a couple of years, because the show has had a serious chemistry problem since Thomas Gibson (Agent Aaron Hotchner) was booted for kicking a producer on the set. When that was followed by Shemar Moore (Derek Morgan) leaving to star in his own drama (S.W.A.T. — it’s awful), it flipped the chemistry and removed all the dominant macho male characters.

There’s an old saying in television that it’s not who leaves that impacts a program; it’s who comes in as a replacement. That’s the one thing producers can control, and the producers of Criminal Minds blew it completely with the new agents. The show used to be built around strong male characters, but all of that has been replaced with mush, and despite the efforts of the remaining cast members, you can’t fix a chemistry problem with good acting alone. Chemistry in a crime drama influences everything, especially the writing.

They’ve injected macho into Dr. Spencer Reid’s character. It doesn’t work at all. They put Emily Prentiss in charge of the unit, but as strong as she is, she simply cannot replace the loss of Hotchner and Morgan. Producers brought in Adam Rodriguez and Damon Gupton to fill the macho void, but it doesn’t work, because Rodriguez oozes empathy, and Gupton is, at best, warm milk. This creates an impossible task for the writers, and they’ve got to know it isn’t working.

They’ve also botched the character of David Rossi, turning him into a bit player instead of the co‐founder of the BAU who was brought to the team after the departure of the original Criminal Minds guru, Mandy Patinkin (Jason Gideon), after two seasons. Patinkin’s character was deep and dark, and that set the tone for the original scripts. I honestly can’t watch some of the first two seasons, because the shows were just too dark. So Patinkin up and quit over creative differences. They brought in a clone, Joe Mantegna, to play David Rossi, said to be Jason Gideon’s partner in the creation of the FBI’s profiling unit years earlier. The chemistry of the cast after hiring Mantegna was, in my view, just right — in fact, perfect. It was this cast that led the show to its position atop the crime drama genre. Sadly, that’s all gone.

Even the show’s oddball personification of love, the adorable Kirsten Vangsness as Penelope Garcia, has been negatively impacted by the loss of the Derek Morgan character. Without her lusty and charming relationship — as a submissive female — to Shemar Moore’s rock‐solid dominant in Agent Morgan, her character now flaps in the breeze of nothingness. His loss is her loss, and none of the current cast members is able to fill that void, so it’s just gone. It’s eliminated the tension of their loving “babygirl” relationship.

Thomas Gibson, the story goes, was very difficult on the set: demanding and angry when he disagreed with production. Here’s how Wikipedia describes his departure.

On August 11, 2016, Gibson was suspended after appearing in two episodes of the twelfth season of Criminal Minds, following an on‐set altercation with a writer‐producer; he apologized for the confrontation in a statement, claiming the dispute arose from creative differences in an episode he was directing (Gibson had previously directed six episodes of Criminal Minds since 2013, along with two last season episodes of Dharma & Greg in 2001). Gibson had a prior altercation with an assistant director and underwent anger‐management counseling.

You know what they say about hindsight, but the truth is that temperamental artists are a part of the creative process and a wide berth is very often a necessity. No matter how ugly this kicking incident was, it wasn’t worth destroying a top‐ranked television drama, but that’s exactly what has happened. CBS blew it with one of its top products, and, as a fan, it’s really agitating and unbelievably sad to watch the whole thing just crumble.

But that’s the way it goes with television.