One-year-old Anthony on floor duty in the kitchen of our old house, circa 2003
We sold our Cupertino house today.
I'm not sure whether our method is the best, but for what it's worth, here's how we sold it quickly, and for a profit:
Buy a crappy, run-down, termite-infested house at what turns out to be the start of one of the greatest housing booms of all time. (I recommend, however, that you know it's termite infested before buying it; we didn't.)
Over the course of the next few years, remodel, pouring more money into it than you paid for the whole property to begin with. Include ample amounts of blood, sweat, and tears. Lots of tears. (Do save the receipts, at least for the money you spend, because it increases your basis for tax purposes.)
Allow enough years to pass so that you clearly miss the peak of the housing bubble, and consider selling your house as the market seems to be starting a death spiral. Be sure to have an over-inflated idea for how much the house is worth.
Fill out all the disclosure forms, but where they have a 2-inch line to answer a question about, say, plumbing issues you might be aware of, write “see attached” and provide reams of information about anything and everything you might suspect. Be very open. (After all, you remember how screwed you felt when the lies of those that sold you the house became apparent.) In the end, supply a report in excess of 5,000 words.
Feel bad when your real-estate agent's first-impression instinct is that your desired price is at least $100,000 more than he feels it might be worth in the market.
Feel worse when, after really checking out the house, doing a lot of research, and consulting with other experts, he suggests an asking price even further below my hopes: a shortfall more than 2.5 times his initial off-the-cuff $100,000 estimate.
Feel even worse, still, when you add up the 1.5" thick stack of receipts from the remodels and realize that if you sell at the offered price, you'll actually take a loss on the property. Shake head in wonder.
Give away, sell cheaply, donate, or throw away virtually everything you own in the house. Then spend $3,000+ to have “stagers” set up the house with furnishings tastefully arranged to make the house feel inviting.
Put the house on the market on a Wednesday, and have 98 real-estate agents visit it during the “Broker Tour” the next day.
Hold an open house on both Saturday and Sunday, to virtual mobs, with something like 200 people (or groups? I'm not sure) going through. Create lots of buzz.
Prepare to listen to offers on Thursday afternoon (because it's Friday morning where you live now). Hope for multiple offers.
Feel slightly vindicated price-wise when there are eight offers, most of them well over the asking price (three were not serious, offering substantially less than the others).
Of five serious offers, one was highest, but three, at essentially the same price, were clustered in a group not far behind. So, give that cluster a second chance to be highest, and wait an hour or so for followup offers.
Two from the cluster submitted followup offers that were higher than the initial highest. So, go back to the initial highest to give them a second chance as well, but when they decline to raise their offer, go ahead and pick the higher of the followup offers and sell the house to them.
Note with minor satisfaction that the house sold on the first day you would even listen to offers. Officially, the house was listed on the market for 8 days.
After subtracting commissions (don't get me started on the nature of how real-estate professionals are paid for their services) and other fees, realize that with the higher-than-listed price, you actually made a slight profit on the house.
Try not to realize that the margin is so slim that if you take into account the property taxes you've paid over the years, your profit disappears to about a break-even scenario.
I find it hard to fathom how we could basically break even after owning the house through a housing market like the one we've seen since we bought it in 1998. I can't help but wonder how much we would have sold it for had we not spent a dime on improvements. Certainly we would have gotten less, but not that much less.
As it turns out, there are two ways I can look at the bright side: one is to ignore the market, and concentrate on the fact that we got out of it all that we put into it (at least money wise — part of our soul will likely stay for some time). Another way is to conveniently forget about what we put into it, and just be happy that we were able to ride the phat part of the housing-boom wave.
(Is it okay for someone my age to use “phat” instead of “fat”, or is that against the Laws of Hip?)
Fumie and I were thrilled that there were multiple offers, but it was also very stressful. We remember exceedingly well the feelings we had putting together an offer on a house, the twisted knot in our stomachs as we waited to hear back, and the crushing feeling of having our offer rejected in favor of someone else's. It was a horrible, horrible experience. Later we went on to buy this house, but that in no way makes the first experience feel any better.
All the agents (of the serious offers) presented their clients' offers well, making sure that we knew all about the client and how much they wanted to raise their kids in our beautiful house. Two of the prospective buyers even presented in person, telling us how much they loved every aspect of the house. One even called her husband and put him on her cell phone's speakerphone so that I could listen in from Kyoto, via my real-estate agent's speakerphone, as he talked about how much he liked the electronic enhancements I'd done. He even commented appreciatively on the tactile transducer I had bolted under the living-room floor. (It shows that he really read the disclosures I wrote!)
(Those of you who might have ever been to a movie night at my house will know what a tactile transducer is.)
It was gratifying to hear all this praise lavished on my house, and about the nice families that were interested in it, but half of me wishes that I didn't hear any of this stuff. It turns a financial transaction into an emotional one. From an emotional standpoint, I would have been thrilled for any of these nice families to have it (heck, two of the families had kids the same age as Anthony, to within a week or so).
But only one could get it, and I was now faced with having to pick, knowing that the others would feel as crushed as Fumie and I did back when our offer didn't get picked. In any case, the offer that was highest won, but we feel bad for the other families.
Fumie and I also remember how yucky it felt during our second attempt to buy a house (when we bought this house that we're now selling). Our initial offer was countered, and our counter was countered in turn. The sellers' final counter (they turned out to be lying cheats) was to eke out an extra $1,000. They risked losing the entire sale (and much goodwill) because they wanted to up the price by less than 1/5th of one percent. What tightwads.
So we didn't want to be like that. Going back for a second round was entirely reasonable, and perhaps it would have been reasonable for a third round, considering that there were eight initial offers and still a lot of interest after the 2nd round. We didn't want to be jerks, so we left it at that. There was almost certainly still money on the table, but it wouldn't have been worth it, emotionally. We'd rather be able to sleep well at night.
In the end, it was the family with the cellphone-calling wife and electronics-appreciating husband whose offer was the highest. I hope they feel thrilled. I feel good because we got more than our asking price, but I realize that they probably have some measure of concern about having “paid well over asking,” though I can't help but wonder whether that's just a result of a too-low asking price. I wonder what the emotions would have been like had we listed it at my initial way-too-high price, then come waaaaaay down and ended up selling for what we sold for today. The end dollar amount would be the same, but would the emotions?
Hindsight tends to be pretty good, so it's easy to second guess my real-estate agent. But in the end, I feel fortunate that we could work with him (Marc Roos). Back when we were looking for a house to buy, he personally walked us through more than 60 houses, and was entirely patient with us, and we learned to trust both his skill and integrity. Those feelings were only strengthened during this selling experience. I highly recommend him.
In my previous post I dug up messages someone wrote almost 10 years ago, and in some places challenged what they wrote. In writing the post, I mostly wanted to document the thread that created the somewhat-famous “now you have two problems” quote, and the challenges and rebuttals I included are a bit unfair of me (being 10 years later, with little chance that the author would care to engage now in a debate about writings and technology 10 years old).
So, with that in mind I'd like to refute a comment made yesterday on the Slashdot review of my book:
“Nondeterministic finite automata” is well defined in comp-sci and Friedl has it wrong. The set of languages accepted by NFA's is exactly the same as the set accepted by DFA's.
Perl's engine and its brethren use search-and-backtrack. They accept a lot of things that are not regular expressions. Such engines don't have much theory behind them, and it's hard to reason generally about what they do and how fast they do it.
In the context of what my book covers, this is unreasonable on many levels, but I won't respond to it now; I'll respond to it 10 years ago. I wrote the following in 1997, in a post on comp.lang.perl.misc:
> Perhaps it's now too late, and we need to find a new
> term for what we used to call regular expressions.
Perhaps it's best to differentiate between the calculus phrase “regular expression”, coined by Dr. Stephen Kleene in the 40s, and the use of the same term in modern computer science. There are many situations in language where the meanings of words have changed. Some examples:
- When I was growing up, “How are you doing?” was not a statement.
- A Japanese futon has no relationship to what is called a “futon” in American English.
- Tennis shoes are rarely used for tennis.
- The little eraser-head pointing devices on laptops are called mice.
Ah, well, I could go on forever. But basically, the term “regular expression” has no implied meaning to most people that first come across it in Perl or whatnot, so little confusion arises. It would have been nice had some other word been used from the start (perhaps one giving more of a hint as to what the thing is all about), but for the most part we're stuck with history. I suppose you could start a crusade to use a different phrase. How about 'whachamacallit'? 🙂
Jeffrey
Whoever wrote the comment also wrote a similar comment in March, and believes that the reality of how things work is somehow dictated by the name. I tend to believe that the reality of how things work is dictated by, well, reality, often despite the name. (Names/functions of government offices, like “Internal Revenue Service”, are an excellent example of this 🙂
My book is concerned with the reality of how things work, and I make it clear in the book that this reality is different from what the linguistic historical baggage might imply to some small subset of people.
To continue the special theme of this post of bringing up 10-year-old writings to rebut current ignorance, here's the sidebar on page 104 of the first edition of my book (the poster having said that he has the first edition), the master source having been last updated on November 4, 1996:
(A similar sidebar appears on page 180 of later editions)
The true mathematical and computational meaning of “NFA” is different from what is commonly called an “NFA regex engine.” In theory, NFA and DFA engines should match exactly the same text and have exactly the same features. In practice, the desire for richer, more expressive regular expressions has caused their semantics to diverge. We'll see several examples later in this chapter, but one right off the top is support for backreferences.
As a programmer, if you have a true (mathematically speaking) NFA regex engine, it is a relatively small task to add support for backreferences. A DFA engine's design precludes the adding of this support, but an NFA's common implementation makes it trivial. In doing so, you create a more powerful tool, but you also make it decidedly nonregular (mathematically speaking). What does this mean? At most, that you should probably stop calling it an NFA, and start using the phrase “nonregular expressions,” since that describes (mathematically speaking) the new situation. No one has actually done this, so the name “NFA” has lingered, even though the implementation is no longer (mathematically speaking) an NFA.
What does all this mean to you, as a user? Absolutely nothing. As a user, you don't care if it's regular, nonregular, unregular, irregular, or incontinent. So long as you know what you can expect from it (something this chapter will show you), you know all you need to care about.
For those wishing to learn more about the theory of regular expressions, the classic computer-science text is chapter 3 of Aho, Sethi, and Ullman's Compilers — Principles, Techniques, and Tools (Addison-Wesley, 1986), commonly called “The Dragon Book” due to the cover design. More specifically, this is the “red dragon”. The “green dragon” is its predecessor, Aho and Ullman's Principles of Compiler Design.
In particular, that third paragraph really sums it up.
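To make the sidebar's point concrete, here's a quick illustration (mine, not from the book) in Python, whose `re` module is one of the backtracking-style engines being described. The language of "doubled" strings, every string of the form *ww*, is a textbook example of a language no true finite automaton can recognize, yet a single backreference describes it directly:

```python
import re

# A backreference pushes the matcher beyond "regular" power:
# the language { ww : w is any nonempty string } cannot be
# recognized by any DFA, but one backreference captures it.
# The engine backtracks over possible splits of the string,
# trying each candidate first half until \1 matches the rest.
copy_lang = re.compile(r'^(.+)\1$')

print(bool(copy_lang.match('abcabc')))  # True  ('abc' + 'abc')
print(bool(copy_lang.match('abcabd')))  # False (no repeated half)
```

That backtracking search over candidate splits is exactly the "common implementation" the sidebar mentions: trivial to extend with backreferences, but no longer an automaton in the mathematical sense.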
(Okay, all done, I'll go back to posts about photography and Anthony...)
As I mentioned in my previous post, my Mastering Regular Expressions book was just reviewed on Slashdot. One thing that struck me in reading all the resulting comments was the (several different copies of an) apparently famous quote that goes something like:
“I know, I'll use regular expressions.” Now they have two problems.
It's apparently quite well known, so it floors me that this is the first I've seen it. Despite being a manifestation of the ignorance discussed in my previous post, I can certainly appreciate it for its wit.
This quote is generally attributed to Jamie Zawinski (an early Netscape engineer) from a post on the comp.lang.emacs Usenet newsgroup. Unfortunately, there's never been a comp.lang.emacs newsgroup, which renders the whole attribution suspect. Oddly curious about it, I did some digging...
It turns out that the source of the mistaken comp.lang.emacs reference was a May 1998 comp.lang.python post by Fredrik Lundh, who used it as a cute quote in his sig. He's a prolific writer, so the sig appeared far and wide; many people picked up on the quote and started using it themselves, and its fame (along with the incorrect attribution) spread.
But it was indeed Jamie Zawinski who first said it, in a Usenet post on August 12, 1997. Unfortunately, it seems that Google Groups, which holds a repository of Usenet postings going back a thousand years, does not have this particular post in its database. If it did, the post would be at the end of this broken link.
[UPDATE Feb 2013: it seems now that the link is no longer broken; Jamie's original post is there]
[UPDATE Jan 2007: Jamie Zawinski added a comment to this post]
Actually, it seems that none of Jamie Zawinski's posts in the threads that spawned from this initial post are in Google's database. This is quite odd. I have been able to find parts of his posts quoted in the replies of others. It was a heated thread, so there's plenty to go on.
There was a thread in comp.emacs.xemacs and alt.religion.emacs in which the idea of embedding Perl into Emacs was proposed. (My first-thoughts comment on this idea is that it's fairly silly, since Emacs already has a powerful lisp interpreter in it. Lisp is odd, though, in that it's a vastly more regular language than Perl, yet arguably less readable. But I digress...).
The main goal of the guy making the original suggestion was to get better regular-expression handling into Emacs. Perl treats regular expressions as first-class language features, making them a breeze to work with. They're not that hard to work with in Emacs Lisp, but in any case, Emacs's regular expressions are much less powerful and have a syntax even less readable than Perl's, if that's possible. (If you look up “toothpicks, scattered” in the index of my book, it brings you to a page about Emacs regular-expression syntax. :-))
Two things must be understood about Jamie Zawinski when evaluating his comments about this idea: he despised Perl, and he spent a nontrivial chunk of his life tending to Emacs and its lisp system, to the point that he considered it a religion. I can understand and appreciate having that kind of passion about something. In any case, talk about embedding Perl into Emacs would be heresy of the highest order.
It seems that during the course of this thread, Jamie referenced a three-month-old post by Kelly Murray in which Kelly sarcastically suggests something even more outlandishly silly (treating all data simply as a stream of bytes). Apparently, Jamie didn't realize that it was meant to be humorous or sarcastic. Combine that with the idea of embedding Perl into Emacs just for its regular-expression handling, and it was enough to put him over the edge; Jamie lashed out:
You are trying to shoehorn your existing preconceptions of how one should program onto a vastly different (and older, and more internally consistent) model. I suggest your time would be better spent learning and understanding that other model, and learn to use it properly, and learn what it can and cannot do, rather than infecting it with this new cancer out of ignorance. The notion that everything is a stream of bytes is utterly braindead. The notion that regexps are the solution to all problems is equally braindead. Just like Perl. Some people, when confronted with a problem, think “I know, I'll use regular expressions.” Now they have two problems.
Jamie really disliked Perl, and in the ensuing discussion had a few other comments about it. In this snippet he responds to a “what's wrong with Perl?” question:
> What's wrong with perl?

It combines all the worst aspects of C and Lisp: a billion different sublanguages in one monolithic executable. It combines the power of C with the readability of PostScript.
(I also appreciate that last sentence for its wit.)
A few days later it became clear that it's not only Perl itself that he's upset with, but also how he perceives it's often used:
Perl's nature encourages the use of regular expressions almost to the exclusion of all other techniques; they are far and away the most “obvious” (at least, to people who don't know any better) way to get from point A to point B.
Mind you, he's keeping a fair mind about himself, allowing that Perl has some merit:
Perl is not *all* bad; just mostly
I find that the next statement is quite telling:
Maybe Java will save the day, once someone straps a Java front end onto the gcc back end.
Later, he says:
The heavy use of regexps in Perl is due to them being far and away the most obvious hammer in the box. The heavy use of regexps in Emacs is due almost entirely to performance issues: because of implementation details, Emacs code that uses regexps will almost always run faster than code that uses more traditional control structures.

Based solely on how lame the syntax is, and how generally unmaintainable regexp-based code is, Perl would be very close to the bottom of my list of choices for most tasks.
I'd agree with that first paragraph if the “most obvious hammer” phrase were changed to “most appropriate hammer”, because Perl is often used for advanced text processing, and that's exactly where regular expressions shine. That being said, I've written plenty of system tools in Perl that are mostly or completely devoid of regular expressions. I use them when they're the best tool, and don't when they're not.
Anyway, it was a colossal waste of time for me to track this all down (and for you to read this far :-)), but once I got on the trail it was hard to get off.
As cute as the “now you have two problems” quote is, it seems that Jamie wasn't the first to come up with the idea. The same quote (but with AWK rather than regular expressions as the punch line) shows up in the sig of a John Myers post from 1988, where he credits a “D. Tilbrook” for it:
“Whenever faced with a problem, some people say `Let's use AWK.' Now, they have two problems.” -- D. Tilbrook
I've also seen the AWK quote credited to a “Zalman Stern” (in this 1993 post of quotations on alt.quotations). [Update Dec 2020: see below for more on “D. Tilbrook” and “Zalman Stern”]
As Mark Bessey notes in a post on his blog, it's an all-purpose joke. I can imagine it being used by servicemen during WW2, along the lines of “Some people think `Let's ask the officers'....”.
UPDATES:
January 10th, 2007: this post made it high enough on reddit that it made my pageview jump by a factor of 10, and in doing so, brought in comments with more details on the history of the phrase than I had been able to unearth myself. Excellent! See the comment section for details.
January 15th, 2007: Jamie Zawinski himself commented on this post.
December 16th, 2020: This post was referenced on Hacker News, and in those comments users Jim Westergren and dang referenced other times this post has come up and resulted in interesting comments (in 2010, 2013, and 2015). The middle reference was supplemented with a “with a great top comment” note, so I took a look and indeed the top comment by Mike Schiraldi is interesting:
There's actually a little more to the story than that. On 2007-01-09, I wrote to David Tilbrook:
Hi David .. I came across a web page
(https://regex.info/blog/2006-09-15/247) investigating the source of the
following quotation:
"Whenever faced with a problem, some people say `Lets use _____.'
Now, they have two problems."
The author of the site seems to have gone through a lot of trouble to
hunt down the original author of the quote. The best he was able to do
was discover a Usenet sig from 1988 attributed to "D. Tilbrook."
I was wondering if this was you -- if so, I think you should contact the
author to set the record straight. His post was recently linked from the
news aggregator site Reddit, at
http://programming.reddit.com/info/xlov/comments and quite a few people
have been reading the story and discussing the quote.
He wrote back:
I can lay claim to being the author, but I cannot remember when or where
I first used it.
Zalman Stern worked for me at CMU so may have quoted me, hence the
attribution to him.
Actually one of the funnier incidents regarding my "famous" quotes was:
"Software is the only business in which adding extra lanes to the
Golden Gate bridge would be called maintenance" -- David Tilbrook -
circa 1981
I was at a meeting when the speaker used this quote and attributed it to
David Parnas -- I was appropriately indignant.
-- david
P.S.: Do we know each other?
The answer to his postscript was no. 🙂
And he later replied again to add:
By the way, I think I coined the phrase at a European conference in
Dublin circa 1985.
I was talking about the difficulty maintaining portable software when
supposedly "standard" tools (e.g., awk(1)) differed from system to
system.
Then later someone pointed out to me that it was appearing in various
signature lines which I suppose led to its being spread.
I forwarded it all to Jeffrey Friedl (the author of the linked post), but I guess he figured the comments already did a good job covering the story, or maybe he wanted to get explicit permission from David to repost the emails but never got it. But I think David's reply is interesting and compuhistoric enough that I don't want it to die in my GMail archives -- and so I'm posting it again here.
My initial reaction to reading this comment in 2020, 13 years after it was written, was that I must not have received that email, but there it is in my archives, with permission from Mike to repost it here. I guess I didn't get around to it then, doh! Uh, better late than never?