I have recently availed myself of some forums dealing with English grammar and usage, and it seems they are inevitably besieged by folks with the most rigid imaginable positions on acceptable writing. I’m always intrigued and amused by discussions of grammar rules among people who insist on what “must” be done when writing English.
Or else what?
Fewer people read and write the way you want?
It’s handy to refer to language conventions as rules, but they really aren’t. None of them. There is no universally recognized authority on what is right or wrong in English, and therefore there is no true right or wrong. No rules. If I want to publish fiction or be taken seriously in a technical report or what have you, I will conform to various sets of conventions, but that is entirely for my own benefit and subject to my own whim. If I didn’t care how my writing was perceived, I might be far less inclined to do so, and without consequence. Does it “degrade” society that so many people make so many glaring errors in all types of communication, from business emails to Facebook to advertising? You may think “yes,” but that would also be simply an opinion.
I used to have that opinion myself, and am still affected by strong vestiges of its influence. I am not immune to my own decades of training in school and at home that developed a concept of meticulous adherence to the conventions of “proper” English. Meme after meme goes by on my Facebook news feed with errors in spelling, punctuation, or usage. Dismayed, I typically refrain from clicking “Like” on those posts, even if the thought is clever. I’m not talking about the language in someone’s typed posts; that’s just all over the map. I mean artistically produced memes that have nicely Photoshopped graphics and overlaid messages of wisdom in custom fonts and all that. More often than not, those have grammar errors. In fact, I’m pleasantly surprised these days to see one that is correct. My first thought is always that it’s too bad they spent all their time on graphics and not two minutes on a spell check.
All that is just my own prejudice, though. I consider those thoughts and behaviors to be residual flaws in my attitude about English, but I haven’t forced myself to change them yet. Well, some of them. I’ve made progress.
To understand why none of that matters, one must merely look to the purpose of language. Accuracy is relative. Style is relative. Language can be treated as art, and there is certainly nothing wrong with that, but that is not the purpose of language. The purpose of language is communication. All else is luxury. Some indulge in luxuries, some do not. The elegance of society may be affected by the relative level of indulgence in linguistic luxuries, but the functioning of society is not. Since elegance is itself a luxury, I submit that society is not inherently degraded by the preponderance of grammar errors in the ever-increasing distribution of written language.
That addresses the consequences of obeying the rules or not, but what about the rules themselves? What is the true validity of the laws of grammar? The authority of any rule of language has no more validity than what its users voluntarily allow it to have. If everyone ignores a rule, they’re not all breaking it. Rather, the rule simply doesn’t exist. It can only exist if people acknowledge it, since there is no recognized authority to establish or enforce such rules.
The language changes; that much is fact. The value of that change, positive or negative, is in the eye of the reader. Until there is no Chicago style vs. AP style or anyone else’s competing textbooks, and there is one and only one recognized and accepted authority for English, all “rules” about all aspects of the language will be nothing more than guidelines.
Some self-appointed grammar police get it, and realize that spelling and grammar are pet peeves to them. Not to everyone. Others, however, think they’re somehow protecting society from evil by complaining about bad grammar, and that’s just annoying. And incorrect. Yes, the language is changing. It’s a living language. Yes, it is changing primarily due to laziness. Individual laziness at numerous stages of life, from not paying attention in English class to not bothering to look something up before sending it out wrong in a text or Facebook post, or even business correspondence.
Does that laziness annoy me? Yeah, it does. I frankly find it astonishing that writing competence among people who are supposed to be professionals has slipped to the embarrassing state it enjoys today. But it’s just me, and I know that; nobody else is annoyed when I read a badly constructed email. I’m the only one reading it.
That’s where this thing comes in about protecting society: no one is harmed by a person with bad grammar except that person. When a colleague sends me a sloppy email, I might make a mental note about the intellectual trustworthiness of that person, but it’s unlikely I will truly behave any differently with them than I would have after reading crisp, businesslike correspondence. Why? Because I am a professional. So, if a professional person such as myself reads an email pieced together in Frankenstein’s lab, nothing happens because I am professional enough to treat the sender professionally in spite of the email. If an unprofessional person reads it, nothing happens because they don’t recognize that there is anything wrong with the email. (And yes, I realize I am equating “professional” with “understanding grammar,” and that is opinion-based, but I do that because, in my opinion, grammar is indeed a prerequisite for behaving professionally, in virtually any profession.)
The upshot there is that there are no consequences to speak of for the sender of that egregious email. Their laziness will propagate and eventually change the language. This is the point where the mildly disturbed grammar cop might remind me that I said no one is harmed by a person with bad grammar, and yet here we’ve all been harmed by a change in language brought about through laziness.
Well, here’s a tidbit: all language change is brought about by laziness. Laziness or ignorance. No living language has ever gotten more regulated or tighter in its usage through the spontaneous behavior of its users. That is contrary to the laws of entropy of the universe. Left on their own, things deteriorate; they tend toward chaos, not order. And for all intents and purposes, language is left on its own. Once kids leave the classroom, they become subjects with the keys to the kingdom. They have infinite write access to the English wiki.
What’s more, English is used by an ever-broadening base of humans across the globe, and it is the primary language for a smaller and smaller percentage of its users all the time. Whatever changes don’t creep in through the laziness of primary users may creep in through the sheer number of people for whom English is a second language. For instance, I don’t know for sure that the common use of the word “install” as a noun came from ESL influences in technical circles, but it certainly could have. Secondary users of English nevertheless use it regularly, and their shortcuts or mispronunciations and the like become commonplace, then accepted, through people with no pretense at all of using the language correctly. They know full well they don’t. Yet mainstream users pick up their habits and before you know it, the language has changed. It’s just the nature of the beast, and if that freaks you out because the whole situation is out of control, give it up. It may be “out of control,” but rest assured it is more in control today than it will ever be again. Entropy, you know.
Bottom line is, if good grammar is a hobby of yours, great. Indulge. If others’ bad grammar irritates you, you have my sympathies. If you believe widespread watchdogging of grammar is a hallmark of an elegant society, I may not argue with you there. But the relative benefit of living in an elegant society is a matter of taste, so when good grammar becomes a judgment about one’s value as a citizen, I take issue. Believe it or not, I have actually been told by grammar police that if I know the rules, I have a responsibility to society to follow them and correct others who do not. That is utter hogwash. To those people I would ask the ubiquitous question from the grammatically challenged: “Well, did you understand what I meant?”
The equally prevalent response from grammar “guardians” would be, “But … but … that’s not the point!” But nothing. Of course it’s the point. It is, in fact, the singular point of language. Grammar and spelling conventions exist for the sole purpose of standardization, so as to more easily facilitate the communication of meaning. All else beyond meaning is the luxury of linguistic art or the pursuit of academic hobby. So get a grip, relax, and quit pestering me with the idea that I have betrayed humanity if I end a sentence with a preposition. I haven’t, and that’s not what language exists for.
Well, here we are in 2013. While there was nothing horrible about 2012, I’m not terribly sorry about leaving it behind. For strange, inexplicable reasons, my writing suffered greatly in 2012, and I plan to remedy that this year.
In particular, I found that when I made a goal of 90,000 words of new fiction last July, that wasn’t a great strategy for motivation. I thought it would be, but I’m still figuring out what makes me tick as a writer, and I discovered that getting behind in a long-range strategy like that made me feel like giving up a lot more than it prodded me to try to catch up.
Since then, I’ve thought about different goal-setting, and I’m making some adjustments for 2013. For starters, I do plan to have word-count goals, but all based on weekly output, not some three-month projection to have an entire novel completed. This year, I’ll be looking at 5000 words per week. Not a huge amount. It’s workable. Plus, assuming I’ll succeed, that will produce 250,000 new words over the year (with a couple weeks of vacation thrown in). Should be roughly three full novels. That will be a lot more than 2012, which is good.
I’m also planning to shoot for a lot of short fiction. I’m aiming to publish a total of 25 titles this year – just over two per month. We’ll see how this all works out, since three novels and 22 short stories is more like 350,000 words, but I’m primarily focused on the quarter-million word count goal. Not as an annual target, though. As a weekly target of 5000 words. If I come up short one week? Not to worry. My goal period resets every Saturday morning. I can’t be behind for more than a couple of days at any one time. That gives me the psychological opportunity to kick-start my writing and achieve a goal in a shorter time frame.
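For the curious, the whole plan reduces to a few lines of arithmetic. Here it is as a scrap of Python (the novel and short-story lengths are just my rough assumptions, not anything official):

```python
# Back-of-the-envelope math for the 2013 plan.
WORDS_PER_WEEK = 5000
WRITING_WEEKS = 50           # 52 weeks minus a couple weeks of vacation
NOVEL_LENGTH = 85000         # rough assumption for a full-length novel
SHORT_STORY_LENGTH = 4500    # rough assumption for a short story

annual_words = WORDS_PER_WEEK * WRITING_WEEKS
print(annual_words)                           # 250000 -- the quarter-million target
print(round(annual_words / NOVEL_LENGTH, 1))  # 2.9 -- roughly three novels

# The 25-title plan (3 novels + 22 shorts) runs a bit hotter:
plan_words = 3 * NOVEL_LENGTH + 22 * SHORT_STORY_LENGTH
print(plan_words)                             # 354000 -- "more like 350,000 words"
```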
Some of this feels like I’m playing mental games with myself and I shouldn’t have to do that, but that’s just how people’s brains work, I guess. That’s why goal-setting is so important for most people. Without goals, you don’t know when you’re doing well, when you should feel encouraged by your productivity. I like my new approach for 2013, though, and that’s the bottom line. Whatever gets me motivated to write more.
I have other plans as well – with luck, some more audio work. I haven’t turned either of my existing novels into audiobooks yet, and I’d like to do that, and expand to some other audio work for other authors. We’ll see how that goes. Writing new words comes first.
So, on to a great, new year! Week One is here – only 5000 words to go – piece of cake!
By saying WAR is not a statistic, I’m not trying to make an esoteric statement about the human view of armed conflict. I’m referring to Wins Above Replacement, the popular baseball term. WAR is what is known as a sabermetric, so named as a mutation of SABR, or the Society for American Baseball Research, and “metric,” meaning (in this context) the measurement of organizational or personal performance.
WAR uses a complicated intermixing of data and other information to attempt to embody, in one number, the overall value or worth of a baseball player. It purports to combine offense and defense, in a way that actually compensates for the relative importance of the position a player plays and the team and league he plays in, and compares that against the hypothetical contribution of a replacement player. This replacement player is considered to be someone at approximately the lowest level of major league ability, such as someone just called up from AAA ball.
A player’s WAR value, then, is supposedly the number of wins that player is responsible for contributing, above what the team would win if a replacement player were in his position.
I understand the appeal of such a number. There is a great allure to having a single indicator that accounts for everything a player brings to a team, making things like trade negotiations easier. Where I have difficulty is when this indicator is considered and treated like a statistic. It simply isn’t.
A statistic is the mathematical interpretation of data. In the purest sense, even home runs and RBIs are not statistics. They are pure data. They are counted events. There is no manipulation of information at all. Batting average, on the other hand, is a statistic. It is the result of dividing base hits (data) by at-bats (data). The same is true of metrics. Metrics are measured values or the results of formulas for which the inputs are measured data.
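The distinction fits in a couple of lines of code, with made-up numbers:

```python
hits = 205       # data: a counted event
at_bats = 622    # data: a counted event

# Batting average is data interpreted by a formula -- that makes it a statistic.
batting_average = hits / at_bats
print(f"{batting_average:.3f}")   # 0.330
```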
WAR includes assumptions and subjective assessments of things like the relative importance of the position a player plays. Once such subjective assessments are introduced, the output is no longer a statistic. It is merely an indicator. What’s more, it is an indicator that includes an enormous number of variables. The natural tendency of those who swear by WAR as the best indicator of a player’s value is to believe that the more variables included, the more accurate an indicator it is. The opposite is true. The more variables, the greater the margin for error, because each input carries its own measurement error and its own assumptions, and those errors compound. That is a mathematical reality.
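To see the intuition, here is a toy model (and it is only a toy: it assumes the inputs are independent and simply additive, which real WAR inputs are not). If an indicator sums many measured inputs, each carrying its own error, the combined uncertainty grows with the number of inputs:

```python
import math

def combined_uncertainty(per_input_error, n_inputs):
    """Uncertainty of a sum of n independent inputs, each measured to
    +/- per_input_error, with errors added in quadrature."""
    return per_input_error * math.sqrt(n_inputs)

for n in (2, 5, 10, 20):
    print(n, round(combined_uncertainty(0.1, n), 3))
# 2 0.141
# 5 0.224
# 10 0.316
# 20 0.447
```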
Considering the subjective inputs and the margin for error, it is somewhat disappointing to know that WAR is increasingly accepted as an important evaluation of players among fans, some of whom believe quite strongly that WAR should be the primary consideration for, among other things, the Most Valuable Player award. This is misguided. I’ve seen such fans argue that General Managers use WAR for roster management, and if it’s good enough for them, it’s good enough for all as the accepted yardstick. If a GM likes WAR, fine, but that is merely a business decision to accept that risk. That doesn’t mean it’s a good decision, or that it should become a standard of the sport in general. Maybe some day we will all regard it as such, but it’s far, far too early, and WAR far too unproven, to be leaping to such conclusions.
Wins Above Replacement is not a statistic. It is not a metric. It is an available indicator: a seriously imperfect curiosity, nothing more.
Writing is an interesting business these days. Ten years ago, the business talk among writers would have been all about which agency was picking up this or that writer, which New York publisher was giving this or that deal. Now, it’s all about arguments. And they all come from the same basic issue: traditional versus indie publishing.
OK, that’s not really true, of course. I have writer friends with whom I don’t argue about this at all. Reading Internet articles and blogs and comments and such, though, it just feels like everyone is arguing.
Admittedly, not all writers are polarized on this. The smart ones are well-prepared to explore either route, depending on the circumstances, what sort of written piece they’re peddling, etc. Lots of variables there. But some are definitely polarized, and the discussions can get heated.
Some on the traditional side of the fence would even wrinkle their nose at the term “indie” publishing, saying it shines a light of legitimacy that doesn’t belong on what is essentially self-publishing. Is it self-publishing? Well, sure. By definition, if I wrote something, and I am publishing it myself, I am self-publishing. Does it deserve a light of legitimacy?
For myself, I’d certainly like to think so. Regardless of the name, self-publishing is not inherently bad. And the term itself carries less of a negative connotation than it did a couple decades ago, but it’s still looked at a little dimly. “Indie publishing” is the new, connotation-free term for it, and in all honesty I would have to admit I prefer it. I’m not a wannabe writer taking advantage of new tools to throw anything I type out onto the market to say I am published. I would balk at calling myself a professional writer, just because I don’t yet make a living at it, and actually don’t yet make even a noticeable contributory income from it. But I know people who would call me a professional writer, because I am in every way taking myself seriously as a professional, investing in learning the craft and actively striving toward a long-term goal of living off of my writing.
This striving for professionalism extends into publishing. When I began releasing my own work, I didn’t just toss it out as published by the author. I created a publishing company, Lardin Press, and all publishing activity is operated through that company. I take the publishing side of the business every bit as seriously as the writing side. I have also invested in learning the business of publishing. Copyediting, elements of cover design, back cover copy writing, developing print editions along with electronic, pricing strategies, and all sorts of things go into approaching this business as a publisher.
Those who are skeptical about taking the term “indie publishing” seriously are focused more on the writers who are not really attempting in any substantial way to be a publisher. They’ve written something, they want to see if it can make a few bucks, and they stick it out into the marketplace. In all likelihood, the production quality is going to be fair to poor. The writing? Impossible to say – could be stinky, could be brilliant. But odds are, if a writer really finds the business side distasteful and does not pursue it in a meaningful way, the production value will likely be low. That, the skeptics would insist, is “self-publishing,” not “indie publishing.”
Do they have a point? Insofar as there is indeed a wide range of quality in published books nowadays, I’d have to say yes. Does it matter that they have a point? Only if their point isn’t that low-quality publishing exists, but that anything not done by New York is inherently low-quality publishing. That attitude chaps my hide a bit. If I chose to push the publishing activity harder, I could expand and do it for other writers. I could become a small press with multiple clients, just like a number of other small presses, and could compete in today’s marketplace with the skills that I bring to the table today. I can either perform or know where to go to find all of the required specialist activities to produce a smashing book. So far, I have chosen to publish only my own manuscripts, but the quality of publishing could compete with others, and in that sense, my books are absolutely being professionally published, even though I am doing it myself.
So, to me, my books are “indie published.” Lardin Press is an independent publisher, meaning not part of a conglomerate or a subsidiary of any other enterprise, and is publishing Joe Cron’s books. It so happens that Joe Cron owns and operates Lardin Press.
But the rub about the relative quality of indie-pubbed books isn’t even the main body of the argument on the subject. The argument is about whether or not indie pubbing is even a good strategy. For someone in my position, it’s really a no-brainer. The time alone that it would take to get manuscripts distributed, wait, wait, wait, get an offer, hire an IP attorney to review the contract, and come to a settlement with the publishing house is incredibly daunting, and would likely cost me at least the first two years of potential indie-publishing income. Even more daunting is the fact that, given the current propensity for unfavorable contract clauses coming out of New York, the odds that I could actually agree to sign a deal are astronomically low, so that manuscript just ends up floating around in Slush Pile Hell for, well, the rest of my life. What on earth good does that do me? Like I said, in my position, it’s a no-brainer that I need to indie pub all novels.
Short stories are a different animal. So might it be if I enjoyed a different position with NY pub houses. If I came into this ebook world with a long history with a New York editor, maybe negotiating a palatable contract wouldn’t be so difficult. I wouldn’t count on it, but maybe.
As it stands, though, I’m coming into this as a writer and businessman looking at the new situation and drooling at the prospects. There is almost no aspect of today’s rapidly morphing publishing industry that does not favor writers. Writers who are willing to grab on and deal with it, that is. Writers who are used to being taken care of by agents and such are nervous, and those are some of the ones who staunchly defend traditional publishing. They are afraid their little world is going to get shaken apart, and they won’t know how to handle being a writer and businessperson.
Here’s a clue, writers: the day you decided to become a writer, you decided to be in business for yourself. You are self-employed. Everyone you come in contact with is negotiating with you as an independent party to all contracts. Agents, publishers, you name it. If you choose to sign your life away, so be it, but it has never been anyone’s business or decision but yours. There has never been anyone to take care of you, and most of the people who tell you they’re going to are putting you in a position to be taken advantage of in potentially heinous ways. You are and have always been responsible for your own career.
Well, well. Believe it or not, the point I had in my head when I started off on all this argument stuff wasn’t the nature of the argument or how any of it irritates or frustrates me (though aspects of it definitely do). It was quite simply that it is totally wicked to exist in a publishing industry in which such arguments can even happen. Earlier in my life, I could never bask in the opportunities I have today. So step up the arguments, because the fact that there is an indie-publishing side of the coin to be reckoned with is amazing to me, and is giving me a brand new career. Bring it on!
As I suspected might occur, it’s been longer than I hoped since my last post. That’s all right, though. It’s because I’ve been writing this past week, and there’s no better reason for missing a blog post than writing stuff I can actually sell.
Starting back on July 1st, I set myself to a challenge. I’m already a little bit behind, but there’s lots of time to catch up.
I think I might do better with an actual goal that I set for myself. It’s a different frame of mind. I’m pretty sure that even if I miss the goal, I’ll have written more than I would have without it, so that’s an improvement, at least.
But I’m getting way ahead of myself. I’m talking as if I already haven’t made it, or am hopelessly behind with no chance of meeting the challenge. If I get into that position, then I’ll start talking about the “well, at least I…” scenarios. For now, I’ll still talk about it with the assumption that I will meet the challenge.
And why wouldn’t I? It’s pretty basic. Simple, simple stuff. Write 1000 words a day. That’s it. For ninety days. Actually, I’ll give myself a few extra. I started on July 1st, and I’m giving myself until October 1st to write 90,000 words. Actual fiction writing, that is. Not blog entries or outlines or notes or anything like that. Fiction writing output. A thousand words a day for ninety days.
On the one hand, that is totally child’s play for any dedicated writer. It’s ground-floor production. A full-length novel in three months. Or the equivalent – some short stories, a chunk of a novel, whatever it ends up being, as long as it adds up to 90,000 words. If I can’t do a thousand words a day, I shouldn’t be in this business.
On the other hand, it is nevertheless true that I have not done this at this rate before. Yes, I have been substandard in my output. I admit that. When I took on the National Novel Writing Month (NaNoWriMo) goal last year of 50,000 words during November, I made it, but it was grueling, frankly. It meant putting off house projects and such, and writing during what felt like every waking moment. And that was to produce 1667 words a day for 30 days.
One thousand words is somewhat less than 1667, but not a lot less, and it’s a pace I’ve never sustained for anywhere near ninety days before. So the challenge is really about how to get faster and more efficient at writing and working it into my daily life, along with my job, and errands, and household projects, and the like. You know, living. How to maintain a pleasant, satisfying life for myself and my fabulous wife, Jill, and still write a thousand words a day.
That’s what I must be able to achieve in order to look to the future and consider that I will eventually be a full-time professional writer. I hope to be quite a bit faster, actually, but you have to start somewhere. Perhaps the leap from averaging a few dozen words a day to a thousand is too big. I don’t think so. I think I can work this out so it doesn’t feel like I’m spending every waking moment writing, yet I have consistent, regular, workable output.
And I have to get over the fact that any full-time writer with an ounce of diligence would look at a goal of 1000 words a day and double over with laughter. If I get stuck on how pathetic this goal is compared to what it must some day be, I won’t get anywhere. I have to just focus on reasonable goals, see how those work, see how habits can change and such, and go on to the next goal. Baby steps are OK, as long as they are taken.
I will keep as my mantras the ideas that it doesn’t have to be perfect but does have to be finished, and to trust the process. The process will improve, but not unless I do it. Writing doesn’t get better or faster by thinking about writing, or reading about writing (though that has its place in providing tools with which to improve the process). Writing gets better and faster by writing. A lot.
Oh, plus a third mantra – have fun! That’s kind of at the heart of the whole idea in the first place, but it bears mentioning. This isn’t a chore.
This plan has some good inherent milestones, obviously – 30,000 words by August 1st, 60,000 by September 1st. I’ll just dig in, see how this goes, have fun, and evaluate where I am at the milestones.
Today, July 8, I’m at 7250 words. That’s 750 behind my quota for the time being, but I’ll make it up. That’s what this whole exercise is about – learning how to do that instead of just tossing my hands up and waiting till tomorrow or next week to do more writing, which is what I would do without this goal. So this is a good thing. A very good thing.
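In fact, the whole quota check is simple enough to write down in a few hypothetical lines (today’s numbers plugged in):

```python
WORDS_PER_DAY = 1000

def words_behind(words_written, days_elapsed):
    """Positive means behind quota; negative means ahead."""
    return days_elapsed * WORDS_PER_DAY - words_written

print(words_behind(7250, 8))   # 750 -- behind, but catchable
```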
Gonna be a great summer!
If you’re a movie producer or director, and you’re thinking of making a movie where the filming perspective is from someone’s personal device, here’s a tip: don’t do it. Just say no.
I’ve seen three movies done that way, that I can remember. There are more, I know, but off the top of my head I can think of three that I’ve seen – The Blair Witch Project, Cloverfield, and Chronicle. There’s another one out right now about somebody filming a party or something, but my fabulous wife, Jill, and I decided immediately upon seeing the preview in that style that we would not be watching that movie. Ever.
Making a film from the perspective of someone’s personal device just doesn’t work. None of the three movies I listed works very well, and the personal device is the biggest reason why. And the main reason it doesn’t work is that every movie shot that way must eventually contain scenes of conflict that are totally unbelievable as footage captured on someone’s personal device.
There’s always the scene where the filmer character’s friends are complaining that he’s still filming and telling him to put the camera down and all that crap, and it’s just that – crap. It’s never believable. No one would be filming those types of scenes on a personal device, and no movie I’ve seen yet has made that aspect believable.
Chronicle is the one that swore us off personal-device movies forever, and it did so by going three-quarters of the way through the movie using the teenager’s movie camera, then switching to a point of view of any camera that existed near the location of the action. Changing the device – and by that I mean the movie’s device of seeing everything through the eyes of the kid and his camera – at that stage of the film ripped apart any investment an audience member could have had in the established device.
What’s more, the movie did eventually show scenes from angles that could not have been captured on any available camera of any kind – no security camera, no cop camera, nothing was there. The moviemakers got themselves in a pickle: the kid’s camera didn’t suffice for the story they wanted to tell, so they abandoned it and went to any available camera in the scene; when that didn’t work out, they tried to surreptitiously show us the action from a point of view no camera could be capturing, and hoped nobody would notice. I noticed.
If you’re going to try to do that – and some movies do violate their own devices successfully – you have to so engross the viewer that they don’t notice the violation. Chronicle came up way short of that by having the initial device be one that inherently prevents the viewer from being engulfed by the film (being shot from a personal device), and then distracting the viewer from that device by changing the rules most of the way through the film (switch from the kid’s camera to any available camera). So, when you get to violating the device altogether, you have an audience that is detached from the movie and watching critically, and when you have that, no violation will go unnoticed.
I’m definitely picking on Chronicle here, because it was the worst offender, but the other two I mentioned didn’t work, either. Blair Witch was immensely popular, but probably because of the uniqueness of the gimmick at the time. The personal-device style did not hold up in that movie, due to the inevitable scenes no one would legitimately film on a personal device. Same with Cloverfield. It’s like people who decide to make these movies have a couple of good ideas about how to do something cool with it at first, then realize somewhere along the line that they actually have to finish the story, and stumble around for a way to do so. It shows. Certainly Chronicle was that way. There’s some amusing stuff between the teenagers early on, but it unravels fast, and I mean the movie itself, not the pleasantness and camaraderie within the story.
I can only hope that the fact that no one has yet been able to make the personal-device movie work right will mean they will eventually give up. For a while, movie makers will see it as a challenge, which I suppose is inevitable, but now I’m to the point where I hope nobody ever gets it right, because if someone does, it will only encourage more of them, and most people won’t get it right. So, here’s hoping all personal-device movies continue to suck, so they go away forever.
I am constantly amazed at the number of professional baseball players who stand at the plate in disgust, or worse, actually say something to an umpire, after they have been called out on strikes.
I played Little League, and City League, and spent a year on my high school baseball team (a team that won the District Championship playoffs after a twenty-game regular season record of 2 – 18!). I was probably around ten years old when I learned that you protect the plate with two strikes. It’s one of the most basic things you learn early.
Mind you, I was not all that good at executing protection of the plate. I excelled with the glove and had a strong arm, and I covered center field with a vengeance – I even did the Mickey Stanley-ish switch to shortstop for one game when our regular was injured – but hitting was not my strong suit.
Late in my teenage playing days, when I realized how comfortable I was hitting balls in the gym from a pitching machine, I began to attribute my inability to concentrate in games to the fact that I have an enormous head, and no batting helmet would fit properly. My brain was always being squashed by the damn thing, wind whistling past the ear holes. I hated batting helmets, and I blame them to this day for my low batting average.
But I digress. The point is that even a markedly sub-par hitter such as myself was keenly aware that taking pitches with two strikes on you was risky. With two strikes, you shorten your stroke and try to get the bat on anything close.
I’ll grant you that professional ballplayers have a better eye for what is in the strike zone than I had as a kid. Now that technology can show us on television whether or not pitches are truly in the zone, it turns out that the complaining players are actually often correct that the final pitch of their at-bat was not a strike.
Doesn’t matter. If the umpire calls you out, it was too close to take. Period. No one to blame but yourself. Suck it up and protect the plate, guys.
Complaining only makes you look like a ten-year-old.
I just watched an enjoyable baseball game in which the Detroit Tigers defeated the New York Yankees, 4 – 3. It was a classic example of one of the primary flaws in the statistical records collected among the vast stockpiles of data in the sport.
The game was tied, 2 – 2, heading into the bottom of the eighth. Miguel Cabrera smashed a mammoth home run to straightaway center – his second of the game over the 420-foot marker – to give Detroit a one-run lead and provide an opportunity for closer Jose Valverde to get a save.
Valverde hit the first batter in the ninth, with the very first pitch. He walked the second batter, then hit the fourth, then walked the sixth, tying the game. The inning ended tied. Every Yankee batter who made contact with a ball got out without advancing runners, yet they scored a run. I enjoy Valverde a lot, but tonight he was completely out of control. It happens.
In the bottom of the ninth, Detroit scored on two hits, a walk, and a sacrifice – details that are of no particular import other than that they provided Valverde with a win. It was perhaps the finest example I’ve ever seen justifying the opinion that the major league baseball “win” is the flimsiest statistic in sports. Surely, there is no greater injustice in statistical baseball than the rule that allows a pitcher to receive a blown save and a win in the same game.
The very beginning of the Tigers’ 2012 season provided two other examples in their first two games. They demonstrated two of the circumstances that most exemplify the meaninglessness of a pitcher being awarded (or not) a baseball victory.
In the first game, Justin Verlander threw eight masterful innings, followed by a blown save and a win by Jose Valverde. What message does that send a relief pitcher? If you’re having a bad day, and you’re so ineffective that you cannot protect even a single run of the lead you inherited, don’t worry, we have an even better reward for you if your team pulls ahead again in the bottom of the inning. You get the win, after what may have been your worst performance of the season. What’s the logic there?
In the second game, Doug Fister pitched scoreless ball for 3⅔ innings and left injured with a 2-0 lead. A posse of relief pitchers followed, and all did well – as well as Fister, where scoring was concerned, as the game ended 10-0. But they all pitched at least a full inning less than Fister. And the win goes to…ooh, sorry Doug, not you. Starters have to go five full innings to get the win. Too bad. Again, what’s the logic there?
Baseball has an official scorer, to make judgment calls on things like hits and errors. For better or worse, that human interpretive element is there, so why not use it to make awarding victories more justified? Why not bestow the authority on the official scorer to recognize that, in some situations, the pitcher that deserves the win is not the one who would get it under these twisted rules?
The validity of getting a win is viewed with a measure of respect for starting pitchers – though even there the influence of situations outside the pitchers’ control casts a shadow on the legitimacy of that respect – but it is universally ignored with regard to relievers. Fans, analysts, and the media alike generally don’t even bother discussing a reliever’s won-loss record as a measure of effectiveness. That’s how screwed up the rules for awarding victories are.
It might even help to simply remove the possibility of relief pitchers getting a win entirely. Or a loss, for that matter. Relievers are in such spotty, short-term situations that one lucky inning or one bad one can easily get you a notch or a ding. Neither is a fair assessment of how truly effective the pitcher is, what kind of overall season he’s having, or whether or not you want to put that same guy on the mound again the next day.
There are certainly other statistics in sports that are dependent on teammate cooperation. A football quarterback’s completion percentage, for instance, doesn’t take into account how many passes were simply dropped by receivers. Those show up against the quarterback. If the NFL were to institute official scorers to rule those drops as errors, the completion percentage would be a truer statistic, but I’m not suggesting they go there.
What I’m suggesting is that circumstances and run support and all the other aspects of baseball that contribute to the weakness of the win as a viable statistic can be vastly improved using conventions that are already part of the game. Firstly, remove wins and losses from a reliever’s stats. Starters don’t get holds and saves; relievers don’t get wins and losses. If no one deserves the win or the loss, no one gets one. Who says there must be a win and a loss awarded every game?
Secondly, allow the official scorer to interpret guidelines such that starters who deserve wins will get them. If that were the case, the “win” would be a much more respectable statistic. And Justin Verlander and Doug Fister would appropriately have been 1 – 0 after those first two games, instead of Jose Valverde and Duane Below.
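If it helps to see it spelled out, here is the proposal reduced to a sketch (the data structure and field names are hypothetical; this is just the logic of my proposed rules, not anything MLB actually uses):

```python
def award_win(pitchers):
    """Under the proposed rules: relievers are ineligible, and the win goes to
    whichever starter the official scorer deems deserving -- or to no one."""
    for p in pitchers:
        if p["role"] == "starter" and p["scorer_deems_deserving"]:
            return p["name"]
    return None   # no win awarded -- who says there must be one every game?

# Fister's game: scoreless ball, left with the lead, team won 10-0.
game = [
    {"name": "Doug Fister", "role": "starter", "scorer_deems_deserving": True},
    {"name": "Duane Below", "role": "reliever", "scorer_deems_deserving": False},
]
print(award_win(game))   # Doug Fister
```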
So, my new novel, The Holitaph, under my pen name, Edgar Henry, is released. Now comes the hardest part of writing. Starting the next project.
In the process of writing, once I’m into a story, I hit bumps and snags and places where I have to pause and figure out how things need to flow, but it’s different doing that in the middle of a manuscript than deciding what to start on.
Some writers begin with a premise and a character and plunge in, letting the story carry itself where it will. I have a very difficult time with that. I like to know the end, and that all the major plot points and twists are figured out when I start. For my style, that’s the only way I am confident that the story will have continuity throughout, that everything I am writing is toward a determined climax. Otherwise, I write in fear that I will get 40,000 words in and find myself in a place where nothing works and I have to scrap the whole thing. I guess that wouldn’t be the end of the world, but I’d be angry with myself for wasting that time.
So, here I am, with a number of story ideas but none of them complete in my head. It makes me apprehensive to start one, thinking I’ll just have to change it later. It’s so early in my writing career that I have a feeling I may get more comfortable doing things differently down the road. I have no trouble believing that ten novels from now I will have a different creative style. I welcome that, in fact, since I know my current style is restrictive. If I am determined to eventually make a living as a writer (and I am), I will need more weapons in my arsenal, and that’s likely to mean utilizing different approaches to different stories.
I will also probably get more used to the idea that scrapping a story isn’t a catastrophe. That’s likely to come with some success. By that I mean after I have tasted some success. Today, every story is critical, because time’s a-wastin’ and I have very little available in the marketplace to create an income stream. My first novel took me three years. The second one took seven months, and six of those were fixing the original manuscript draft that I wrote during National Novel Writing Month last November. Seven months is better than three years, but I need to be down around two months if I expect to be able to amass an inventory that will one day sustain me independently.
I’ve been trying to think in terms of a five-year plan, and six novels a year over five years is still only 30 novels. Spread those out over a few pen names and you have only ten new novels from three names in five years. That’s not much, really. And that’s cranking out a novel every two months. (I need to get a lot faster at this, and sustain it.) But to get back to the point at the beginning of the previous paragraph, if I do that, and I have 30 novels out, and I’ve gotten through the creative angst and adrenaline and all that 30 times, I’m likely to be more forgiving if I charge into a story that I have to scrap in the middle. I will have more confidence that the next one will probably work. Especially if those 30 novels are selling at all.
For the time being, though, I need to either get some story details worked out so I’m comfortable mustering something up, or throw caution to the wind and muster up the beginning of a story. The only thing I can’t do is not be writing something. That doesn’t help at all.
Well, here I am with a new website and blog. I’ve started these before, and I get about two posts in, then three or four months go by without having time to write anything, then it seems strange and silly to start it back up. I’ll see if that changes with this one; I hope so, ’cause I’d like to be in the habit of writing more often, even if it’s just rambling in a blog. Keeps me spewing words, and that’s a fair enough thing for a writer to do.
So, to explain the title of this, my inaugural entry: back in October of 2010, I attended a very intensive and amazing workshop by Dean Wesley Smith and Kristine Kathryn Rusch, on marketing tools and strategies for writers. I highly recommend visiting both of their websites if you are a writer. There is a wealth of valuable information, discussion, and perspective in their posts and the comments by other writers.
The workshop was eight days long, and the twelve of us who attended were writing proposals, blurbs, and what-have-you, all day every day. Every waking moment outside the workshop sessions was spent writing stuff that was due at the next session. It was held in Lincoln City, Oregon – an ideal environment for a week of writing – and my fabulous wife, Jill, made the trip down from the Seattle area with me, since we have always loved the Oregon coast.
By Wednesday, the pattern of intently focusing all my time writing was pretty well established. We never go to Lincoln City without eating Gallucci’s Pizza at least once, and that night was pizza night. We had it delivered to our vacation rental home in Road’s End. When it arrived, I got up from my computer and picked it up at the door, then brought it in and began setting up for dinner. Jill said, “Do you have time for this?”
“Frankly, no,” I answered.
“You get to eat…and poop!” she insisted. I relayed the story to the workshop group that evening, to much laughter, and someone said “Eat Poop Write” sounded like a good saying for a T-shirt. I don’t have the T-shirt (yet) but I did have some refrigerator magnets made with “Eat Poop Write,” and it also seemed an acceptable subject for a blog entry.
So much for my first post. Time to go eat and poop, until the next installment…