Archive for July, 2009

Automatic Assessment in Learning Games

July 27th, 2009

I was asked a really interesting question this morning. I’ll quote it verbatim:

how does assessment in the game works—or better yet, you were explaining what’s special about the assessment in the game, what would you tell the layman (i.e., someone who isn’t involved in serious gaming or academics or computer science or engineering)—that is, someone in small business.

For someone who has been leading the assessment effort at DISTIL for almost two years, this was surprisingly difficult to do. It took me forever to understand what serious games are, then forever to make the link between learning and gaming, and then to understand the requirement for assessment and the distinction between learning-focused assessment and certification-focused assessment. Then came the phase of trying out different techniques to see which ones worked. For the record, I've implemented everything from asking the students their opinions to sophisticated machine learning that can build models of the biases and 'plans' that the user employs during problem solving.

As always, the correct level of engineering is somewhere between the trivially simple and the unreasonably complex. In the last game I worked on, I employed the analytics capability that underlies the data preparation phase preceding the application of machine learning classification models, without going so far as to build the actual statistical classifiers.

So what was the answer to the question? Well, here it is, in the required two sentences:

The assessment in the learning simulation employs sophisticated analytics that generate learning reports based on the actions that the student chooses to make in-game. This is a much more relevant criterion for assessment feedback than the traditional alternatives of asking the student their opinion (via smile sheets) or triggering canned reports based on the final outcome of the simulation session (which hide details on the efficacy of the choices made in-between).

The technology itself is based on modern analytics and event identification methodology, the same insight-generating tools that have transformed the field of marketing and catapulted modern corporate giants (such as Google and Amazon) to great success by literally enabling them to 'read the mind' of their customers based on their actions, and thus serve them better.

So what exactly are analytics, and why have they allowed firms like Amazon and Google to prosper? An analogy is quite useful here, once again from the field of marketing. In the days when dinosaurs roamed the Earth, marketing efforts consisted of purchasing media space, whether it was a printed advert in the papers, a timeslot on TV or a billboard on a hill near a major road. The hallmark of all three techniques was that you never actually knew if the advertisement was seen, and even if you could approximate the number of exposures (defined as the instances when the advertisement could potentially have been seen by a human nearby), you never knew the actual conversion rate (the instances when the person seeing the advertisement took the desired action, i.e. bought your product or gave your sales team a call).

Nowadays, on the web (and in other rich media), not only can you specifically measure every instance of your advertisement being seen (i.e. the exposure), but you can track every conversion event that takes place (i.e. the user coming to your page, interacting with your shopping cart, checking out, and making the final payment). You can also calculate the conversion ratio for each step, and employ closed-loop feedback to judge the impact of changes that you make to your advertisements, to your content and layout, and to the online sales process. All changes initiated by you will impact user behaviour and shift key conversion and sales statistics.
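The closed-loop measurement described above amounts to a little arithmetic over event counts. A minimal sketch in Python, using invented step names and numbers:

```python
# Sketch of funnel analytics: counts of users reaching each step.
# The step names and numbers are illustrative, not real data.
funnel = [
    ("saw_ad", 10000),
    ("visited_page", 800),
    ("added_to_cart", 120),
    ("checked_out", 90),
    ("paid", 75),
]

def conversion_ratios(steps):
    """Return the step-to-step conversion ratio for each transition."""
    ratios = {}
    for (prev_name, prev_n), (name, n) in zip(steps, steps[1:]):
        ratios[f"{prev_name} -> {name}"] = n / prev_n
    return ratios

ratios = conversion_ratios(funnel)
overall = funnel[-1][1] / funnel[0][1]  # end-to-end conversion rate
```

Re-running this after each change to the advertisement or the checkout flow is exactly the closed-loop feedback the paragraph describes.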

Surprisingly enough, the same techniques work remarkably well for judging the competency level of individuals in learning games. As long as you can identify the key events (akin to conversions in online sales), you have useful checkpoints by which you can judge the student's competency. Best of all, this can all be done without the active intervention of an instructor, which can really boost the relevance and applicability of eLearning and distance learning platforms, and free the instructor from the day-to-day operational drudgery of assessment to focus on the actual crafting of better content and better assessment. Perhaps this was the reason that DISTIL was conferred the best product award at DevLearn '08, the premier platform for judging the leading innovations in eLearning in North America.
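To make the checkpoint idea concrete, here is a toy sketch. The event names and weights are entirely invented; a real system like the one described would derive them from the curriculum and game design:

```python
# Hypothetical key events for one learning scenario, with weights
# expressing how much evidence of competency each one carries.
KEY_EVENTS = {
    "diagnosed_root_cause": 3.0,
    "consulted_stakeholders": 2.0,
    "applied_fix": 2.0,
    "ignored_safety_check": -2.0,  # negative evidence counts too
}

def competency_score(session_events):
    """Sum the weights of recognised key events; ignore the rest."""
    return sum(KEY_EVENTS.get(e, 0.0) for e in session_events)

# A session log as a stream of in-game actions:
session = ["logged_in", "diagnosed_root_cause", "applied_fix",
           "ignored_safety_check"]
score = competency_score(session)  # 3.0 + 2.0 - 2.0 = 3.0
```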

This innovative form of assessment that we developed addressed a key pain point for eLearning instructors and instructional designers. Anyone who has taught courses, either in academia or in industry, knows about the wasted days spent checking exams. For them, this automatic assessment is definitely a welcome relief. It is also a lot less threatening for the student, and does not provoke the same level of anxiety that formal exams elicit. Best of all, it can directly support criterion-referenced assessment, without devolving into a norm-based examination (as happens with many other exams), since the actual choices are tracked, rather than the way those choices are expressed.


On Architectural Creativity (Lessons from Game Design)

July 26th, 2009

I’ve spent the past two years working in a very interesting work environment, surrounded by a very innovative and talented group of individuals. For the record, our mandate was nothing less than to completely transform learning from a dull, staid, painful and uncertain process to one that was engaging, immersive, fun and robust. I’ve worked on a good half dozen learning games in this period, where we essentially took learning objectives that would normally have been the focus of a PowerPoint deck, wrote these up in the form of a curriculum, and brought them to life in the shape of interactive learning simulations.

There were three distinct groups that supported the effort of designing and implementing these serious games:

  1. Designers and managers who ensured that the learning, game design and assessment/analytics were meaningful
  2. Art, sound and content creators who took the designs and reified them
  3. Software engineers who built the underlying interactions and processes, and stitched together the pieces of the games into the final product.

Obviously, I'm missing many people in the cast of thousands here; there is no mention of product/portfolio management, quality assurance, customer service, sales etc. These are all support functions, and although they are critical to the success of the firm, they do not directly create the game. They may exercise creativity during the course of their work, though.

I'll talk a bit more about my personal role(s) in this process at some other point; however, the key point I'd like to discuss in this article is creativity. I had numerous opportunities to observe creativity (and the lack thereof) in action while working on these games, many of which rivaled Hollywood blockbusters in their complexity and the ambition behind their vision.

What I realized recently, after two years of observing some occasionally bewildering behavior, is that creativity is all about how you deal with ambiguity. It can be a truly counter-intuitive process by which you first introduce ambiguity (by adding choices) and then eliminate it (to arrive at the final solution). As a part-time machine learning researcher, I can even point out a class of algorithms where computer software 'practices' creativity to learn and model some very useful 'solutions' in the form of patterns. These bootstrapping algorithms essentially start with a seed set of rules (or examples) and 'invent' patterns, derived from newly available data, that have a decent likelihood of forming a rule in the desired classifier. However, the 'creative' process does not end there, as each of these 'candidate' pattern rules needs to be evaluated against what is known (via the seed set or other identified 'ground truths') in a process that ruthlessly eliminates most of the patterns. Interestingly, human creativity works in a very similar way.
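The bootstrapping loop described above can be sketched in a few lines. This toy version learns a suffix pattern from example words; the data and the pattern language are invented purely for illustration:

```python
# Toy bootstrapping: propose candidate patterns from new data, then
# ruthlessly prune any candidate that contradicts the seed examples.
seed_positive = {"running", "jumping", "coding"}  # known '-ing' words

new_data = ["singing", "table", "thinking", "stone"]

def propose(words, length=3):
    """Candidate patterns: the last `length` characters of each word."""
    return {w[-length:] for w in words}

def prune(candidates, known):
    """Keep only patterns consistent with every known positive example."""
    return {p for p in candidates if all(w.endswith(p) for w in known)}

# Introduce ambiguity (many candidates), then eliminate it (few rules):
candidates = propose(list(seed_positive) + new_data)
rules = prune(candidates, seed_positive)
```

The `propose` step is the part that "adds choices"; `prune` is the evaluation against ground truth that eliminates most of them.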

The individual most successful at the creativity game keeps the end-goal of the product firmly in mind while exploring the many options available for achieving that goal. Thus, the most creative individual will bring in ideas from other fields (which may not be readily apparent to others) and practice lateral thinking.

There are many celebrated successes of lateral thinking around us. Consider 'object oriented design' in software engineering, which is based on the successful application of component strategies in electronics and hardware. In hindsight, it makes absolute sense: build components that hide all the complexity within, and you can reuse and repurpose software, and create sophisticated designs (by assembling mature components) that are much more ambitious than would be humanly possible if you were building completely integrated software. This is a key reason why Intel's CPUs, which were originally designed to abstract out common processing functions for calculators and traffic-light controllers (quite surprisingly, Intel's specialization in its early years), now form the 'brains' behind computers throughout the world. Some 'creative' engineer recognized the absurdity of reinventing the processing component for every new project, and eliminated a complete waste of resources by architecting it as a separate module.

However, it is equally important to emphasize at this point that the creative process does need to be focused towards the end-goal. Architects (who are undoubtedly creative people who need to deliver working products) speak fondly of the phrase 'form follows function'. Indeed, the most celebrated classical architect broke this down into three distinct areas and gave us a very useful set of evaluation metrics. Vitruvius argued that any architecture (and the resulting product built) can be measured by its utility, robustness and aesthetics (although Vitruvius referred to these as Commodity, Firmness, and Delight in the original text, which can be a bit confusing). The same metrics work wonderfully well for modern software, which can range in quality from a veritable house of cards to the Taj Mahal.

Although it is true that the first solution you think of is quite likely to be successful (thus proving the utility of early commitment in hostile environments), it is important to consider multiple paths to your end-goal if your role is that of a creative. Generally, you have the luxury of more time than the firefighters and hostage negotiators on whom the early-commitment studies were based. It is also much simpler to make changes early in the process, while dealing with paper prototypes, than when you are stuck with the sunk cost represented by semi-built buildings or half-developed code. I would argue that a project where the entire focus changes mid-way, the initial model is nixed, and a 'reset' takes place is one that has been designed by an intellectually lazy architect who takes pleasure in letting others 'live out' his or her experiments. It's a dangerous practice that is insensitive to the people who are paying for the project, to the customers who need the product, and to the team that is living through the development of the solution and can literally burn out from a lack of perceived progress.

For creative individuals, there is an important tradeoff between keeping your options open and delivering a useful end-product. You absolutely need to iterate through various architectural options while designing the final solution. However, when it comes to the final push to implement, it is absolutely critical to have a design already nailed down; the commitment should have already taken place. This design may not be detailed at the level of every piece of content or code required, or outline all necessary activities; however, there should be clear acceptance criteria that specify the form (in terms of utility provided, robustness/operating environment supported, and visual manifestation) and the underlying event processing model (which is the game designer's equivalent of the Hollywood narrative script).

There should not be any doubts at this stage about the commodity, firmness and delight anticipated from the final product. Any deviation from this axiom reflects a lack of responsible behavior, or an abdication of design ownership. In these scenarios, you may want to consider resurrecting the ancient pirate tradition and making the architect walk the plank.

Software Practices: Agile vs. Waterfall

July 21st, 2009

For the past couple of years, I've mostly been working with agile practices. It makes absolute sense considering that in most of the projects I work on, we don't even know if a solution exists for the problem we are tackling. It makes complete sense to identify the final goal and work via iterations to satisfy the acceptance criteria that we've set.

If anyone asks me when to choose an agile development practice, and when to work with a more traditional waterfall method, I'll ask them one question that should help resolve the quandary. The question is:

Do you know enough about what you're implementing to create an end-to-end design, and identify every step you need to take towards this goal?

If the answer is yes, the best option is the waterfall method. You, my friend, have a clear, crisp engineering project that can be predictably tackled without any risk of scope-creep. Why not enjoy the next year with a bit of up-front planning? By this point in your career, you've probably earned the right to sleep at night knowing that things are on track!

If the answer is no, I'd nudge you towards agile practices. Put into place weekly or bi-weekly meetings to set the goals, remain cognizant of the deadlines, and keep the liberty to chop quality or functionality as you run out of time. It's all about 'dead reckoning', baby! You've probably lost sight of shore a long time ago. Your project is staffed by a passionate team that works well together, has open and clear communication, and is focused on git-r-done.

After all, you only have a fuzzy idea of where you and your team want to get to, and have to use frequent meetings with clients and the team to ensure that you're on course (and moving in the right direction).

I can only promise you that you're in for one hell of a ride, and I would not want it any other way. After all, Neil Young said it best: 'it's better to burn out than to fade away!'

You can learn a bit more about the difference between incremental and iterative development in the aptly titled post:  The waterfall trap for “agile” projects

If you liked this post, why not subscribe via email, to be notified of other similar posts in the future?


Monetizing Blogs

July 17th, 2009

Sometimes, the rules of real life apply very solidly to the ethereal world of the Internet. I recently read about how Bankaholic was acquired for $15 million. Not bad for a blog run by a single individual (Mr. Wu). It made me wonder: why would BankRate have paid such a hefty price for this blog? I did a bit of digging around, and noticed that the key is lead generation. Here is a quote from Mashable:

“Just in last three years alone, the prices for some of the bids have gone up 200% to 300%,” said Chatter. “As an example from Google, to be one of the top three advertisers for ‘high yield savings,’ you have to pay $13.20 for a click. With a 1% conversion, it costs you $1,300 to acquire a customer.”

“Those are the numbers behind this sale – essentially, it’s lead generation.”

Now, all leads are not equal; banks are quite happy to pay oodles of money to acquire customers, who essentially represent a lifetime flow of revenue for them. A really ridiculous example is that of asbestos mesothelioma. You've probably never heard of it, but lawyers in the US are willing to pay over a thousand dollars a click for this keyword. This is because people who are searching for this rare disease are quite likely to want to sue their former employers for oodles of cash. Lawyers like that very much, especially if they can take the case on a contingency basis and keep 35-50% of the net proceeds on top of their legal fees.
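The economics quoted above reduce to one formula: cost per customer equals cost per click divided by conversion rate. As a quick sketch:

```python
# Customer-acquisition arithmetic behind the Mashable quote.
def acquisition_cost(cost_per_click, conversion_rate):
    """Cost to acquire one customer from paid clicks."""
    return cost_per_click / conversion_rate

# At $13.20 a click and a 1% conversion rate, each customer costs
# roughly $1,320 (the quote rounds this to $1,300).
cost = acquisition_cost(13.20, 0.01)
```

A blog like Bankaholic is valuable precisely because it delivers those same leads at a fraction of that per-click price.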

So there you have it, gentlemen: if you're looking to make some cash via your blog, you need to enable advertising and focus your content on information that commercial entities are deeply interested in. I don't envy Mr. Wu of Bankaholic, who spent 18 months writing about bank mortgage interest rates, but he obviously chose a niche that worked out well for him (and which contained paying customers). It's eerily like the choice of location for a business: if you run a restaurant downtown in the most fashionable area, you'll probably end up with lots of customers in the daytime (around the offices) and the evening (around social activities). Move elsewhere, and you may have more people in the day than at night, or vice versa. If you choose the wrong location, you may have to shut down for lack of business.

If you want to learn more about particular keywords that your blog may want to focus on, try out the Google Sandbox. It’ll tell you about traffic as well as advertiser competition.

Security... what, me worry?

July 14th, 2009

It's been a watershed moment for our startup this weekend. I got the components for the server and put it all together. It's a sexy system: an i7 CPU with 8 cores, 12 GB of DDR3 RAM, lots of cooling, and a RAID5 configuration with terabytes of reliable drive space. It practically floats an inch in the air under its own power, and emits a wicked blue glow. This is more powerful than a dozen of the servers that Google started out with. It only set me back around $1,700, and is a performance machine architected to withstand loads of queries.

I'll get this machine colocated at a managed site later (once we get some revenue coming in), but for now, I have it in my basement, accessible via dynamic DNS.

I was super-excited to get started, so I created three accounts for my collaborators and mailed the info out. Within 10 minutes, I noticed via the 'w' command that one of my collaborators had logged in. I tried to reach him via the console 'talk' command, and my attempt was refused. I figured it was a configuration issue and fired up Skype to talk to him. The conversation went as below:

[10:25:56 AM] Shahzad Khan says: Hi AL
[10:26:04 AM] Shahzad Khan says: How is the connection to the server?
[10:26:11 AM] Shahzad Khan says: Is it at a workable speed ?
[10:26:49 AM] AL says: Oh, I had not even seen that e. mail.
[10:26:53 AM] AL says: Let me check.

Ok, at this point, I’m perplexed. What’s going on? He’s already logged in!

I double-checked, and another of my collaborators was logged in too. I then fired up the 'who' command to see where they were coming in from. Well, they were apparently no longer in Ontario; they were coming in from Spain!

In a blinding flash, I realized that 10 minutes after setting up the servers and accounts, we’d already been hacked!

I booted the 'unwanted guests' off, and changed the passwords to strong AOL-style ones, rather than the throwaway passwords that I had shared with my friends. Paranoia is my new watchword! I hadn't been hacked in 14 years, since the core wars that used to take place between the IRC junkies... and that was all good fun among friends. The hackers who tried to hijack my server are professionals. They're either scanning the block of IPs that my ISP uses for their DSL customers, or the dynamic DNS server that I employ to keep my server's name updated for the world. There is a standard going rate for 'bots' and 'smurfs' harvested this way, and my poor server was about to be kidnapped and sold into slavery.

My mistake was to believe in 'security by obscurity' and not worry about the strength of the credentials. It's only my instinct for 'normal' server operation (and noticing something amiss) that saved us this time. Next time could be messier. I noticed that only the two accounts with common usernames were compromised. The other accounts were not broken into, as their usernames were unusual. This leads me to suspect that the incident was the outcome of a plain vanilla dictionary attack on my citadel. Oh, the shame of it all!
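One obvious remedy, sketched here in Python, is to stop hand-picking throwaway passwords and generate random ones instead. The character set and length below are just illustrative choices, not a policy recommendation:

```python
# Generate a random password using Python's CSPRNG-backed secrets
# module, so a dictionary attack has nothing to latch onto.
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*"

def strong_password(length=16):
    """Return a random password drawn uniformly from ALPHABET."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

pw = strong_password()
```

Disabling password logins entirely in favour of SSH keys would, of course, be stronger still.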

How’s that for baptism by fire? 10 minutes! Real life can be so brutal.

Architectural changes in marketing

July 8th, 2009

This is a guest post by Glenn Schmelzle, who is contributing his analysis on the new opportunities in marketing that follow from the paradigm shifts sweeping the information technology landscape.

OK, so marketing isn't known for its use of cutting-edge tools. But as someone who's been handling those tools for 15 years, I have noticed big changes in the technology that supports the buying and selling of products. These technologies have made life easier for both sellers and buyers, but I've deliberately grouped them by the side they favour, because the more they're used, the more they end up benefiting one side more than the other.

Innovations benefiting the buyer’s side:
Probably the biggest boon for those who need to do their research before buying is the corporate website. Unlike the days of calling an 800 number and getting a brochure or catalogue in the mail (and several follow-up calls!), people can obtain rich detail on a product before identifying themselves. Thanks to XML, price-shopping sites allow them to compare competing products. Sellers aren’t fond of these developments, because they have to divulge a lot of information, but market forces give them no choice.

As email became the dominant communications mode for B2B interaction, it began to be accompanied by a terrific tool: the spam filter. This innovation, more than legislative restrictions, has put people in control of their inboxes. Spam filters are welcomed by any principled marketer and feared by any spam artist. Sure, there's room for sellers to send a one-time-only inquiry as well as opt-in emails, but in the end, you can choose whom to maintain email relations with.

Social media is on the rise as a buyers' tool. Its chief use here is to connect with others to share information on products without even consulting the product's makers. There were earlier iterations of this, like TripAdvisor and Epinions, but the newer crop (Twitter, Facebook, TechCrunch and the blogosphere) has put the web's usefulness as a third-party opinion tool into overdrive.

Innovations benefiting the seller’s side:

I think CRMs have had the largest impact in recent years. Whether local or in the cloud, private or open-sourced like SugarCRM, they are great for letting everybody in a company toss what they know about the customer into one bucket. The resulting profile gives a picture of prospects that is much more accurate than ever before. Here’s one example of how CRMs and direct marketing techniques have helped: You used to receive new product promotions in proportion to the product’s revenue forecast. If you weren’t part of the audience it was meant for, tough! Now, you are (usually) receiving promos for items geared for you. The fact that you receive more of these messages is a direct result of the mushrooming number of products on the market; it’s not marketing’s fault.

The beta deserves mention. No, not the VCR format that duked it out with VHS in the '80s. Few technologies today are launched 'cold'; most are pre-released to power-users. Everything about a product can be crowd-sourced today... and it's a good thing. The vehicles for leaking info (and code) on new products have also exploded in use. WebEx, AppExchange, SourceForge and Amazon's EC2 have all dramatically reduced the cost of sharing work-in-progress with potential buyers. This ultimately lowers risk for the seller; no one wants another 'New Coke' fiasco on their hands.

Creating documents in Adobe format has been a significant development in marketing. As the P in PDF indicates, it’s made print-quality collaterals extremely ‘Portable.’ This has not only eliminated printing and mailing costs, it has produced instant gratification and reaction from buyers on the content of those documents. If buyers don’t react well to the collaterals, marketers can re-write and re-publish them in no time.

Finally, I'll mention the umbrella category of Business Intelligence tools, which includes web analytics, email measurements and social media monitoring. These are all means for marketers to understand what works; this data was nigh impossible to obtain in pre-Internet days. Marketing automation tools like Silverpop, Marketo and Eloqua are now giving unprecedented visibility into the sales funnel. This holds the promise of tightening sales forecasts and showing executives specifically how well their tactics are working.

To conclude, these tools have been fantastic as they’ve forced buyers and sellers to rethink how they do business. Together, these have all helped buyers and sellers reach out to each other. I think they’ve supplanted old, crude, disruptive marketing methods. They provide a great indication of how far we can go in the future, although knowing exactly where innovation will happen next is anyone’s guess.

Glenn Schmelzle is a technology marketer and worked with Shahzad at an Ottawa-based startup. He can be reached at

Architectural changes in web applications

July 6th, 2009

For an 'old-timer' like me, who witnessed the birth of the web and the adoption of the Internet, it's been a challenge to unlearn some 'rules of thumb'. I'm listing some of these as food-for-thought for others who follow a similar technical path.

Moore's law has mutated. Technology is no longer about boosting speeds and capacities. Gone are the days of the break-neck races between Intel and AMD to achieve higher gigahertz in their CPUs. The new reality is all about parallelism: multiple cores, caching layers in architectures (typically via memcached), and formal and informal means of splitting data across multiple machines (i.e. sharding, load-balancing, map-reduce). Any non-trivial architecture that requires massive scalability has to build in the capability of synchronizing across distributed server components.
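As a small illustration of the sharding idea mentioned above (the shard names and hashing scheme are invented for the sketch, not a recommendation):

```python
# Hash-based sharding: route each key to one of N shards
# deterministically, so every server agrees where a record lives.
import hashlib

SHARDS = ["db0", "db1", "db2", "db3"]

def shard_for(key):
    """Pick a shard by hashing the key (md5 used only for spreading)."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]

home = shard_for("user:42")  # the same key always lands on the same shard
```

Real systems layer consistent hashing on top of this, so that adding a shard does not remap every key, but the core routing idea is the same.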

Assemble battle-tested components, rather than building a proprietary stack. I'm surprised that people who are learning to program are still taught to use linked lists and to spend time at the data-structure level. Most developers will never need this granularity of understanding, and will simply plug in the data structures from the C++ Standard Template Library, the Java Collections Framework, or whatever language they prefer to use. Obviously, this low-level knowledge is very useful if you're working in an area that needs it, but frankly, the majority of developers do not. The existence of Service Oriented Architectures actually makes it possible to 'plug into' remote processing capabilities that are no longer even managed by your team. Cloud computing has taken this to another level: Amazon's EC2 is not a bizarre anomaly, but a celebrated part of the mainstream now.
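In miniature, this is what plugging in a battle-tested component looks like: Python's `collections.deque` is a ready-made double-ended queue with O(1) appends and pops at both ends, so there is rarely a reason to hand-roll a linked list for this job.

```python
# Using the standard-library deque instead of a hand-rolled linked list.
from collections import deque

queue = deque()
queue.append("job-1")       # O(1) push at the tail
queue.append("job-2")
queue.appendleft("urgent")  # O(1) push at the head
first = queue.popleft()     # take work from the head
```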

Database normalization is passé. There was a time when people bragged about how normalized their DB was. It was a time when purists reigned. Nowadays, unless you're tracking the world's financial data, you don't need that level of normalization. It's the sign of a confident developer when they purposefully denormalize parts of their database to speed up database access and reduce the burden on their server. It is possible to do this without running into excessive redundancy, stale data, and integrity problems. The art is in knowing how!
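A toy sketch of the trade-off, using plain Python dicts in place of tables (all names and fields invented for illustration): the order rows carry a copy of the customer's name so reads need no join, at the price of one extra write path.

```python
# Denormalized data: 'customer_name' is duplicated onto each order so
# listing orders never requires a join back to the customers table.
customers = {1: {"name": "Ada"}}
orders = [{"id": 100, "customer_id": 1, "customer_name": "Ada"}]

def rename_customer(cid, new_name):
    """The discipline denormalization demands: update every copy."""
    customers[cid]["name"] = new_name
    for order in orders:
        if order["customer_id"] == cid:
            order["customer_name"] = new_name

rename_customer(1, "Ada L.")
```

The read path gets faster; the write path gets this one extra loop. Knowing when that trade is worth making is the art the paragraph refers to.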

REST is available almost for free. There are a number of development frameworks that allow your web application to be offered almost immediately using the SOA model. Sure, you'll still have the human web interface, but by building on Ruby on Rails, you get the ability for others to query your web application as if it were a remote component in their system. This reduces the interface-rendering processing, and allows collaborators to develop reliable systems that are integrated via contracts with your system.
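The paragraph above speaks of Rails; as a language-neutral sketch of the same idea, here is a stdlib-only Python WSGI app (routes and data are invented) that exposes a machine-readable JSON resource alongside whatever human interface you would build:

```python
# Minimal WSGI sketch: the same application can serve programmatic
# clients with JSON, which is the heart of getting REST 'for free'.
import json

ITEMS = {"1": {"name": "widget", "price": 9.99}}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path.startswith("/items/"):
        item = ITEMS.get(path.split("/")[-1])
        if item is not None:
            body = json.dumps(item).encode("utf-8")
            start_response("200 OK", [("Content-Type", "application/json")])
            return [body]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"not found"]

# Calling the app directly, the way a framework's test client would:
def get(path):
    status = {}
    def start_response(s, h):
        status["code"] = s
    body = b"".join(app({"PATH_INFO": path}, start_response))
    return status["code"], body
```

A framework like Rails (or Flask, Django etc.) generates this plumbing from your models, which is exactly why the REST interface comes almost for free.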

These are the most interesting paradigm changes that have taken place in web architectures. Any comments on other shifts that I may have missed?

The rebirth of hope

July 3rd, 2009

I've seen this all happen before. It comes with the territory. Those who live in the startup eco-system have to be convinced that their superior strategies can help them out-maneuver larger, established 'dinosaurs'. In this eco-system, you always live between hope and fear, convinced that you can make the next big thing, but thoroughly aware that you're seriously resource-constrained.

Your friends are all working for universities, 3/4/5-letter government agencies, eBay, Google, Microsoft etc. They have secure positions, which could be yours as well if you simply took a few steps in the right direction. However, you have to have a reason to stick to the knife's edge, standing out in the elements, ready to ride the market forces using your nimble, low-cost, disruptive business model. The five weeks a year of holiday and the full pension do not excite you. The gold watch and the retirement are false idols that are not for you.

Well, as a good friend of mine used to say, 'never mistake a clear view for a short distance'. Sometimes, the founder's vision is spot-on, and you have to cross the desert, sign on to the Black Pearl, spit in the face of conventional wisdom, change the world, invent the state of the art, and boldly carve your initials on the structure of the marketplace.

Other times, you weaken while crossing the desert. The big dinosaurs you were sneering at are comfortable in their established watering holes, while you wander down the unbeaten path and are slowly worn down by the elements, the economy, and perhaps by more nimble, faster-reacting, or less ethical competitors who take chunks out of your capital and clients. Add internal conflicts in the firm, a late arrival to market, and difficulty with user adoption, and things get worse.

The end result of the futile trek across the desert is 'the end'. A time comes when you run out of cash... and then you run out of believers-in-you who are willing to give you more cash to keep the dream alive (usually for an insanely huge share of the firm; no free lunch, remember!).

Then the enterprise has only one way to go: tits-up. The painstakingly assembled processes, the hard-won camaraderie and the vision that's 90% of the way there end up snuffed out. There is no white-knuckle corkscrewing towards the ground at 300+ mph, no spectacular flame-out, no 'final battle' in which the dice are rolled and the last desperate conflict leads to ultimate victory or abject despair. All you have is the march towards the inevitable; you know it's three months away, then two months, then two weeks, and then it's today, and then you turn in your keys, clear out your office, say your goodbyes, and head home to think, and to figure out whether you have the finances to strike out towards your own goals.

After all, you're marooned in the middle of the desert, far from the comfortable watering holes of the familiar economy, with the shadow of the vision glittering just out of reach on the horizon. If you're out of luck, you throw yourself back into the mundane and trudge back to the familiar. If you've still got a spark left, you recruit a new crew who share your spirit, and then decide whether you want to adopt the vision that was just out of reach as your own, or maybe, just maybe, you spot something else that is even better now.

After all, while you've been making your way through the desert, you've learned, improved, and added many new notches to your counting stick. You're not the same person who started this adventure. There is no shame in trying to achieve the impossible; if nothing else, you learn and improve, and can one day achieve your dreams.

However, the last get-together is quite bittersweet. The people who were with you day-to-day are not just your colleagues, they are your friends, and the esprit de corps lives beyond the venture that folded. After all, this is an intense environment, and the people with you have the same spark as you; otherwise, they'd either never have left the safety of the shore, or they would not have stuck it out as long as they did.