Thought Leaders In Cloud Computing: Alan Perkins, CIO of Altium (Part 3)

Posted on Thursday, Dec 23rd 2010

By Sramana Mitra and guest author Shaloo Shalini

SM: What is the architecture that you are using to build and roll out something like this? I presume that this cloud computing discussion is relevant to this new marketplace, this electronics marketplace that you are developing, right?

AP: In part, yes. We were strong proponents of the cloud long before it was called that. A business that is 97% export oriented and spread all over the world has to deal with far more complexity in its business systems. You simply cannot run a business of this size and shape on traditional software pricing; it is just too expensive. So we have been strong and passionate advocates of the cloud and similar types of technology for business systems for a long time. To us, this is just an extension of what we are already doing. To give you a simple example, we adopted Salesforce as a business platform quite early on. Using Salesforce.com, Altium developed and deployed new quoting, purchase order, project management, inventory management, campaign management, and support management systems across its global employee base in a matter of weeks.

There is an e-mail I sent to Salesforce four or five years ago about our Salesforce experience, and they later told me it had a huge impact on their product direction. We manufacture what we call the NanoBoard, which is a piece of hardware you can plug an FPGA into. You can then plug an entire range of different peripherals into it, such as electronic breadboards, which enables you to get a prototype back in minutes rather than waiting for weeks.

Now, when we manufacture the NanoBoard and all its peripheral devices in China, we burn a record ID into the physical board, and that ID is stored in our Salesforce infrastructure. Along with that, we plug in an entire range of things that are stored in the cloud and then run on the boards. The test results for these devices are stored back into the Salesforce deployment. When customers plug any device into a USB port on their computers, they get an overlay of the board, which is a composite photograph of all the currently plugged-in peripherals. This is downloaded from Amazon S3 based on that particular device and its peripherals. Over the top of that, it shows the serial number, the bench it was worked on, the batch it was manufactured in, and a history of all the test results for that particular board. Our intention now is to provide that kind of electronic production management facility to our customers so that they can provide it to their own customers. You simply can't think that way if you don't have a cloud-based approach at your foundation. Our financial systems are all based in the cloud, and we are heavy users of Google Docs and so forth. We use these cloud-based systems in very strong ways.
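The lookup flow Perkins describes can be sketched very roughly: the record ID burned into the board keys both the test-history record and the composite overlay image on S3. This is a hypothetical illustration only; the bucket layout, key scheme, and field names below are invented for the sketch and are not Altium's actual schema.

```python
# Hypothetical sketch of the NanoBoard lookup described above.
# All names (key scheme, record fields) are illustrative assumptions.

def overlay_key(board_id: str, peripherals: list[str]) -> str:
    """Build a deterministic S3 object key for the composite photo of a
    board plus its currently plugged-in peripherals (order-insensitive)."""
    parts = "-".join(sorted(p.lower() for p in peripherals))
    return f"overlays/{board_id}/{parts or 'bare'}.png"

def board_summary(record: dict) -> str:
    """Render the metadata shown over the overlay: serial number, bench,
    batch, and latest test result, as stored against the burned-in ID."""
    return (f"SN {record['serial']} | bench {record['bench']} | "
            f"batch {record['batch']} | last test: {record['tests'][-1]}")
```

For example, `overlay_key("NB-001", ["LCD", "audio"])` yields a stable key regardless of the order the peripherals were plugged in, so the same composite image can be cached and reused.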

SM: I presume you moved from a non-cloud environment, one which is more a license software model, to a cloud-based environment. What did that do to your cost structure?

AP: That is a good question! We had some interesting challenges in trying to drive the business as we grew, and we wanted to be able to cope with the complexities of operating in a range of different markets. Initially, we developed one system in-house that covered everything from finance to opportunity pipeline management, sales project management, issue tracking, and so on. It handled processes such as inventory and everything related to it. It was a multilingual, multicurrency, multi-tax-jurisdiction sort of product. We were looking at replacing the sales opportunity management part of that system with something like an off-the-shelf CRM tool. We initially looked at Oracle and found that just the cost of implementing it – forget the cost of the software, just the cost of deploying it – was going to be $600,000. Now, the money didn't worry me so much, but the complexity it implied did.

By moving away from improving our in-house developed systems to Salesforce, and extending that to other systems as well, we have been able to reduce our internal IT staff by probably around 40% across the board, possibly even 50%. I haven't sat down and quantified it, but there have been significant savings there, and also in our e-mail and communications systems. Earlier, we had 14 mail servers around the world, which were all sorts of trouble to manage. I remember at one point our president was a bit of an e-mail abuser, sending 50-megabyte attachments and so forth, which we shouldn't have allowed in the first place. She didn't use e-mail systems the way they were designed to be used, and at one point 140 hours had to be spent fixing her e-mail because it got corrupted. We replaced all 14 of those mail servers with Google's e-mail service. Now we use Gmail, and that costs us – including Postini and a recovery system with 10 years' retention – $75 per person per year. That is nowhere near what it was costing us to manage this before, and nothing like what we would have paid if we had gone to an Exchange server, or wanted to replace all of this with a collaborative Exchange backbone. We have 500 e-mail accounts, and that costs very little indeed compared to our earlier in-house setup.
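The Gmail figures quoted here imply a total annual spend that is easy to check from the numbers in the interview:

```python
# Back-of-the-envelope check of the quoted e-mail costs:
# 500 accounts at $75 per person per year (including Postini
# archiving with 10 years' retention).
accounts = 500
cost_per_person_per_year = 75  # USD, as quoted
annual_cost = accounts * cost_per_person_per_year
print(annual_cost)  # 37500
```

That is $37,500 a year for the entire company's e-mail, which puts the comparison with running 14 in-house mail servers in concrete terms.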

SM: Interesting! What do you do about large file transfers? What you have just alluded to is a real problem in your industry, right? In electronic design automation, in fact in all kinds of design automation, the files that need to be transferred are large.

AP: Yes, the files are large. Our software builds are currently about 1.8 gigabytes. Then of course you have the design files and whatnot. We get them out to our customers using Amazon S3 and CloudFront. We used to use Akamai, and we found the Amazon service much faster and much cheaper. Now we use Amazon CloudFront and S3 to deliver our builds to 40,000 people at once. We are moving to a new delivery model where it will be a continual stream of system updates rather than a sort of monolithic six-monthly release. It has taken us a while to make that change, but it is a very effective change in our delivery model. The actual delivery from the cloud is simply a question of having parallel streams. As for the data files floating around the office, file transfer really hasn't been that big an issue.
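The "parallel streams" idea rests on the fact that S3 and CloudFront honor HTTP Range requests, so a large build (around 1.8 GB here) can be fetched as several byte ranges concurrently and reassembled. A minimal sketch of the core chunking logic, with an illustrative chunk size of my own choosing rather than anything Altium specifies:

```python
# Minimal sketch: split a large download into inclusive byte ranges
# suitable for parallel HTTP 'Range: bytes=start-end' requests against
# S3/CloudFront. The chunk size is an illustrative assumption.

def byte_ranges(total_size: int, chunk_size: int) -> list[tuple[int, int]]:
    """Return inclusive (start, end) byte ranges covering total_size bytes."""
    return [(start, min(start + chunk_size, total_size) - 1)
            for start in range(0, total_size, chunk_size)]

# e.g. a 1.8 GB build fetched in 8 MiB chunks:
ranges = byte_ranges(1_800_000_000, 8 * 1024 * 1024)
```

Each range can then be requested on its own connection and the pieces written into the output file at their respective offsets, which is also how S3's own multipart transfers work.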

For large files, I would say the biggest issue is for our marketing and design departments, which produce videos and the like. The development videos can be terabytes in size; at this stage we can't do that in the cloud because they are just too big. We have 10-gigabit connections with the people in a Saudi office and a large disk array of 50–100 terabytes to support the kind of material they work with. But that is really the only area where we are storing things in-house because of file sizes.

This segment is part 3 in the series : Thought Leaders In Cloud Computing: Alan Perkins, CIO of Altium