Venturing into Venture Capital: Part 1

It all started when Mondy (aka Subhajit Mandal, financial trader from Singapore and the loudest in our team) and yours truly individually contacted the VCIC authorities about our interest in participating in the competition. For the uninitiated, VCIC lets business school teams play the role of a venture capitalist, analysing real startups, complete with due diligence and partner meetup rounds. The VCIC folks, slightly befuddled by two individual queries from the same school, connected us, and so the two of us set out to find the perfect team to enter the competition with. Long story short, we found it in Ricky Wong (IPO specialist from Hong Kong and a really heavy (pun intended) drinker), Jose Antonio Borrero (Ecuadorian project financing specialist and a hardcore party person), and Pornteera Pawijit (Biotech PhD and Thai, enough said).

And so we lumbered through January and February, meeting each other as much as the hectic business school schedule allowed, preparing for a subject we were all very passionate about (venture capital) but knew very little of. After talking to a bunch of seniors, we started to unravel the mystery shrouding VCIC, and eventually found ourselves in a meeting room in Nanyang Business School, waiting for the regional round to start.

In the regionals, we analysed 3 startups: QSearch, EduMatters, and 3DPrinter. The due diligence round began with us interviewing the 3 startups for 15 minutes each. While we may have been a bit aggressive, we got answers to most of the questions we were looking for. Although we liked none of the companies, we had to choose one nonetheless, and after much deliberation (involving some scuffling and tiny blue bruises), we chose QSearch, an automated marketing and analytics startup. The company was in high demand: barring one, all the other teams had also chosen QSearch. This led us to the negotiation round, where we tried to bring the valuation of the company down to less than one-third of the founder's figure. Mondy and Ricky came up with a clever financial structure for the term sheet (not that the judges liked it so much in the end, but it gave us an edge for sure), and up we went against a very professional nine-time serial entrepreneur. 10 minutes into the session, I am proud to say, we had the founder completely agreeing to our logic for the lower valuation, and we sealed the deal. We did miss a few points, which I'll bring up in the next part.

Lastly, we entered the partner meetups with the judges. The judges in VCIC are top-notch venture capitalists and angel investors who role-play senior partners in your VC firm and judge you, among other things, on the deal you present to them. We could answer most of the questions the partners threw at us, most of them revolving around the investment opportunity, our rationale for the investment decision, and the deal structure itself. We were berated for not diving deeper into the team aspect of the company and the real motivations of the founders. Taking this with a pinch of salt, we sat with fingers crossed for the judges' final decision. Delighted to say, we won (!), and also received a $1,500 award.

After the competition, we had a networking session with some of the judges, which turned out to be a great way to interact with the investor community of Singapore.

Next Stop: University of North Carolina, US, for the global finals!

This blog post was originally contributed by me to the NUS MBA blog and is reposted here. Please drop your comments below or get in touch with me at nikhilkapur at outlook dot com.

What would you like for lunch today, Mr. Entrepreneur: BBQ Wings or Sambal Chicken?

I know this sounds like a culinary post, but it is not. It is about much more serious stuff, and sadly not as tasty. Countless people talk about the Silicon Valley ecosystem and venture capital every day. Much has also been said about the Asian counterparts, be they in India, China, Indonesia, or the little red dot where I currently live. But I haven't seen many people compare the two. I have interacted with plenty of VCs in India and Singapore thanks to TommyJams. Last month, I also got a chance to visit New York and North Carolina for a venture capital competition (more about that in another post), and rubbed shoulders with some of the more influential VCs from the Valley. This is what I found:

1. Easy Say Easy Do

US VCs are much more willing to take the plunge, even further than the entrepreneur, as long as they see a good deal. Asian VCs think, parry, take a step forward, take a step backward, and then finally jump. Imagine a fairly deep lake with crystal-clear water. A US investor will strip off and jump. An Asian VC will bring out his thermometer, check the temperature, check his own blood pressure, strap an oxygen cylinder on his back, and then jump. I am not saying this is good or bad; maybe the Asian ecosystem is just not built for investors to take risks easily. But this is where the difference lies.

2. A friend in need is a VC indeed

US VCs are much more concerned than their Asian counterparts about their relationship with the entrepreneur and whether entrepreneurs perceive them as potential partners. In Asia, the investor still believes that if he or she brings in the money, the startup just has to deal with his or her idiosyncrasies, whereas in the US, the investor is fighting against so many other players that all he can do is put on a friendly-chap mask, cross his fingers, and hope that the entrepreneur chooses him. Hopefully, the rising deal competition in Asia will drive similar results here too.

3. Get your hands dirty before you bake a cake

Barring one small VC firm from Delhi, none of the Asian investors ever talked about how they could help us with our company. It was a shock to me that EVERY, and I repeat EVERY, investor from the US started investment conversations with this topic. Again, likely a result of the higher competition, but the difference is striking. The concept of "smart money" is yet to evolve, at least in India.

4. Cool is the Word

Refer to this tweet from Jason Mendelson (one of the more talked-about VCs, from Foundry Group) and you'll get a summary of what VCs in the US are typically like. Compare that with this channel from a typical Asian VC firm. I am sorry to say it, but one word: BORING. Bring out the humour, guys; entrepreneurs are fun people, and they want fun people around them, especially after a hard day at work. US VCs have learnt this the hard way, and Asian ones will have to too.

5. 30 Seconds to Fame

So what is it that the US investors are after? I believe that after a few decades of investing in startups, they have come to realise that it might not be just about the returns they make on their fund, but also about the journey they go through with the entrepreneur. As an investor, you are the passenger in the back seat, admiring the landscape and yelling at the entrepreneur driver when he hits a rut. But in the end, what you want is a fun ride, good music on the radio, and to finally reach your destination. US investors, at least the ones I talked to, seem to have learnt this lesson, and Asian ones will have to too, sooner rather than later.

So as you can see (and I only realised this myself while writing this post), I seem to be biased towards US VCs, and with some reason. Does this mean I'll look westwards for my next fundraising? Probably not. Does this mean I'm in a better position to find the right investor back here in Asia? For sure, yes.

Disagree? Don't like something I said? Feel free to drop your comments or write to me at



It feels weird to revive this blog after so long. I switched away from it when I started blogging for TommyJams, and of course the startup took up most of my time, so this blog sadly had to go into a coma.

Since then, things have changed a lot. Eat, Play, Code. I think I do less of all three than ever before in my life, but that said, these 3 things are still the ones that make me happiest. Why, then, do I not do them more often? Because, for the last 4-5 years, I have turned into, for lack of a better word, an explorer. I explored the music industry with my startup, explored more than 15 countries in the last 2 years alone, and am now exploring an MBA in Singapore as I type. And I don't regret one second of these 4-5 years. They have opened my eyes to the world around me, and hopefully I can bring that out in future posts.

So as I write this post and pretend that someone other than me might read it, let's embark on a new journey, and let's hope that this blog stays active and the exploration never stops.


libvorbis and Milepost

After integrating Milepost into nspr, I found that it was not possible to measure the performance boost there. So Dan suggested that I try the same thing for libvorbis, the Ogg Vorbis audio codec library used for HTML5 video. So I started working on that, and as I'm now used to the Mozilla build system, it didn't take too long to perform the same steps, with minor variations for libvorbis.
Unfortunately, while using Milepost for libvorbis, I am now facing a compilation error in the file vorbis_res0.c. It compiles perfectly fine as long as I am not using Milepost, but the moment I enable Milepost, it fails.
I have informed the cTuning people about this, and a thread is running here. It might just be that it's something I am doing wrong, but frankly I can't think of anything. I hope they can give me a solution to the problem soon.
Lastly, I still cannot check the performance boost, because the web service is still not running properly. I have informed the cTuning people about this too, and they are working on it right now. I hope it gets resolved soon, because until it does, I am stuck with nowhere to go.

nspr and Milepost

I haven't posted for some time now, mainly because I did not have anything concrete, but now I am regretting that. There are so many tweaks here and there that I have made in the past 10 days or so to integrate Milepost GCC into the Mozilla build system that I am finding it difficult to remember all of them. Still, here is a brief summary of my progress since last time.

Well, it started with a chat with Dan (dwitte). He suggested that we start with SpiderMonkey and try to get Milepost integrated there, because the build system of SpiderMonkey is the same as Mozilla's. He also suggested that we try to tweak the code written for PGO (profile-guided optimization), which also uses two passes over the system. So off I dived into PGO, got tangled up in makefiles and configures and whatnots, and then finally had to approach Ted (ted) to make some sense of it all :). And then I got to know about the part of the build that sets the CFLAGS, and all the fog cleared :). So I basically figured out how PGO was working, though Ted told me it had been disabled on Linux. Anyway, it helped me a lot in understanding how to call multiple makes and so on.

So I started trying to integrate Milepost into SpiderMonkey when I came across a discussion on the cTuning group which said that C++ is not currently supported, and I saw that most of the files I was compiling were C++, so I had run into a dead end. So off I went to Dave (humph) and Ted, and they gave me the names of some C-only modules; humph suggested nspr, a C-only module which provides a platform-independent API for system-level functions. So I had to start again with nspr, although it was quite similar to SpiderMonkey and much simpler to understand :).

After much banging my head against the build system, I managed to understand the flow, and I now knew what I had to do. So I created a "milepostbuild" target, similar to the "profiledbuild" target of PGO, which calls submakes, changing the environment variables as required by the ICI plugins of Milepost every time. It was a huge task in itself to find out how to export the variables to the current shell in the first place, but I finally managed. I currently export them in configure, and to call configure I use "source ../configure" in bash instead of the normal "./configure". This sets up my variables in the current shell, and I can then change these variables when calling make again from the makefile by adding VAR=Val alongside.
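The mechanism described above can be shown in miniature: a variable exported in the current shell is visible to make, and can be overridden for a single sub-make invocation by appending VAR=Val to the command line. The target and variable names below are made up for the demo; they are not the real nspr build targets.

```shell
# Write a one-target makefile that echoes a variable.
printf 'milepostbuild:\n\t@echo "pass list: $(ICI_PASSES_ALL)"\n' > /tmp/milepost_demo.mk

# Exported shell variables become make variables...
export ICI_PASSES_ALL="default-passes"
make -s -f /tmp/milepost_demo.mk milepostbuild
# prints: pass list: default-passes

# ...and a command-line VAR=Val overrides the exported value.
make -s -f /tmp/milepost_demo.mk milepostbuild ICI_PASSES_ALL=cfg,dce
# prints: pass list: cfg,dce
```

This per-invocation override is what lets one target drive several submakes with different plugin settings each time.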

This done, I now had to get the ICI plugins to work properly with the build. For this I needed to add the name of the source file being compiled by GCC to the files that the plugins create, which contain the executed passes (.txt) and the static features (.ft). I couldn't find a way to do this at first, and I found myself wandering through the GCC code with no idea at all, but after some searching I found the "function_filename" feature, which returns the filename. I used it, and it worked fine on my small programs, but it did not work in the build. Finally, I realised this was because a relative path was being returned instead of just the filename; I mended that and, voila, everything works now!
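The bug above, in miniature: the feature returned a relative path where a bare filename was expected. The path below is hypothetical; stripping the directory component is the essence of the fix.

```shell
# A relative path as the build might hand it to the plugin (made-up example).
src="../nsprpub/pr/src/io/prio.c"

# ${src##*/} removes the longest prefix ending in '/', leaving the filename.
echo "full value: $src"
echo "filename:   ${src##*/}"
# prints: filename:   prio.c
```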

So, I am now able to extract the GCC executed passes for each nspr file along with its static features. I now have to integrate the web service, to predict the flags from the extracted features, but right now there is some problem with the cTuning web services. There is also a slight problem regarding the passes, for which I need help from the Milepost people. Lastly, although I am currently making this work on a file-by-file basis, what we really need is something that can work on the whole module in one go: working on each file, especially via a web service, takes a lot of time and, as dwitte mentioned, it won't scale. Already my build time for nspr has gone from seconds to minutes, and I haven't even used the web service yet. I have posted this concern to the Milepost authors and hope to receive a positive reply...

Milepost GCC-Web Services

So, after banging my head for almost two days against the web service of cTuning and the CCC Framework, I have finally managed to get the combinations correct and am able to receive compiler optimization flags from the cTuning web services. There are a few glitches here and there; I think it requires that the platform, compiler, and environment one is using already be present in the database. So I opened up the database, retrieved some records, and used the platform, environment, and compiler IDs from there to retrieve the predicted compiler flags, and I am quite pleased to say that it is working now. Although I see that, using the generated flags, I am getting a worse runtime than with the normal -O2 and -O3 levels :)..

Also, I read on the cTuning website that currently only compiler optimization flags are predicted by the ML-based compiler; optimization passes are not yet predicted, but there are plans to incorporate this too. I will try to follow up with the authors of Milepost and find out whether that part has been done or is in progress.

Milepost GCC - Plugins

Here are some of the plugins that I have installed and a summary of what they do:
  • save_executed_passes.legacy
    This plugin saves the executed passes per function in external files.
  • save_executed_passes_with_time.legacy
    This plugin additionally times the execution of the passes, for split compilation.
  • substitute-passes.legacy
    This plugin substitutes the original GCC pass order with one read from either per-function external files ("ici_passes_function.txt"), one global file ("ici_passes_all.txt"), or the environment variable ICI_PASSES_ALL (passes separated by commas), thus allowing external manipulation of passes (adding, removing, or reordering).
  • extract_program_static_features
    This plugin extracts program static features per function as vectors and saves them into "ici_features_function.txt".
The installation of these plugins was not very intuitive, but I managed by tweaking the code here and there. The plugins are now working fine with GCC. The only problem I still face with Milepost is how to predict the optimization flags from the static program features extracted by the last plugin mentioned above. I believe (I may be wrong here) that we need to access the cDatabase and push our features into it, which then returns a set of optimization flags that can be used to compile with GCC. There are two ways to do this: using the web services or using the CCC Framework. I have currently tried both, but there are issues.
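For concreteness, the pass-list file consumed by substitute-passes.legacy is just a line of comma-separated pass names. The pass names below are made up, and the parsing is my own sketch, not plugin code.

```shell
# Write a pass list in the comma-separated format the plugin description gives.
printf 'cfg,ssa,pre,dce' > /tmp/ici_passes_all.txt

# Split it back into one pass name per line, as a consumer might.
tr ',' '\n' < /tmp/ici_passes_all.txt
# prints:
# cfg
# ssa
# pre
# dce
```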

I cannot find a PHP script with which I can send a request to the "predict_opt" web service. Without this, it's not possible to use the web service. The script for adding optimization cases is available, though.
With the CCC framework, I am unable to access the web service using sockets. This is the part I am currently working on. I have posted a request on the community discussions for the PHP script to be made available. Meanwhile, I'll try to get the CCC framework to work and return the predicted flags.