Wednesday, July 28, 2010

Teaching Linear Programming using Microsoft Excel Solver

Ziggy MacDonald
University of Leicester
Linear programming (LP) is one of the most widely applied O.R. techniques and owes its popularity principally to George Danzig's simplex method (Danzig 1963) and the revolution in computing. It is a very powerful technique for solving allocation problems and has become a standard tool for many businesses and organisations. Although Danzig's simplex method allows solutions to be generated by hand, the iterative nature of producing solutions is so tedious that had the computer never been invented then linear programming would have remained an interesting academic idea, relegated to the mathematics classroom. Fortunately, computers were invented and as they have become so powerful for so little cost, linear programming has become possibly one of the most widespread uses for a personal PC.
There are of course numerous software packages dedicated to solving linear programs (and other types of mathematical program), of which LINDO, GAMS and XPRESS-MP are possibly the most popular. These packages tend to be DOS-based and are intended for a specialist market that requires dedicated LP tools. In recent years, however, several standard business packages, such as spreadsheets, have started to include an LP solving option, and Microsoft Excel is no exception. The inclusion of an LP solving capability in applications such as Excel is attractive for at least two reasons. First, Excel is perhaps the most popular spreadsheet used both in business and in universities, and as such is very accessible. Second, the spreadsheet offers very convenient data entry and editing features, which allow the student to gain a greater understanding of how to construct linear programs.
To use Excel to solve LP problems, the Solver add-in must be included. Typically this feature is not installed by default when Excel is first set up on your hard disk. To add this facility to your Tools menu you need to carry out the following steps (once only):
  1. Select the menu option Tools | Add-Ins (this will take a few moments to load the necessary file).
  2. From the dialogue box presented, check the box for Solver Add-In.
  3. On clicking OK, you will be able to access the Solver option from the new menu option Tools | Solver (which appears below Tools | Scenarios ...).
To illustrate Excel Solver I will consider Hillier & Lieberman's reasonably well known example, the Wyndor Glass Co. problem (Hillier & Lieberman, 1995). The problem concerns a glass manufacturer which uses three production plants to assemble its products, mainly glass doors (x1) and wooden frame windows (x2). Each product requires different times in the three plants and there are certain restrictions on available production time at each plant. With this information and a knowledge of contributions to profit of the two products the management of the company wish to determine what quantities of each product they should be producing in order to maximise profits. In other words, the Wyndor Glass Co. problem is a classic, albeit very simple, product-mix problem.
The problem is formulated as the following linear program:
max z = 3x1 + 5x2   (objective)

     subject to       x1 <= 4  (Plant One)
                     2x2 <= 12 (Plant Two)
               3x1 + 2x2 <= 18 (Plant Three)

                  x1, x2 >=  0  (Non-negativity requirements)

     where          z = total profit per week
               x1 = number of batches of doors produced per week
               x2 = number of batches of windows produced per week.
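As a cross-check on the formulation, the same LP can be solved programmatically. Below is a minimal sketch using SciPy's linprog (an assumption on my part; the article itself uses only Excel), with the profit coefficients 3 and 5 from Hillier & Lieberman's original Wyndor problem, which reproduce the Answer Report's optimum of 36 at (2, 6):

```python
# Solve the Wyndor Glass Co. LP with SciPy as an independent cross-check
# (the article itself uses Excel Solver; SciPy is an assumption here).
# linprog minimises, so we negate the objective max z = 3*x1 + 5*x2.
from scipy.optimize import linprog

c = [-3, -5]             # negated profit coefficients
A_ub = [[1, 0],          # Plant One:        x1        <= 4
        [0, 2],          # Plant Two:             2*x2 <= 12
        [3, 2]]          # Plant Three: 3*x1 + 2*x2    <= 18
b_ub = [4, 12, 18]

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None)] * 2,   # non-negativity requirements
              method="highs")

print(res.x)      # optimal batches of doors and windows: [2. 6.]
print(-res.fun)   # maximum weekly profit: 36.0
```

Excel Solver reaches the same figures via the simplex method applied to the spreadsheet model.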
Having formulated the problem (yours may have substantially more decision variables and constraints), you can then proceed to enter it into Excel. The best approach is first to list in a column the names of the objective function, decision variables and constraints. You can then enter some arbitrary starting values in the cells for the decision variables, usually zero, as shown in Figure One. Excel will vary the values of these cells as it determines the optimal solution. Having assigned the decision variables some arbitrary starting values, you can use these cell references explicitly when writing the formulae for the objective function and constraints, remembering to start each formula with an '='.
Figure 1
Figure One: Setting up the problem in Excel
Entering the formulae for the objective and constraints, the objective function in B5 will be given by:
=3*B9+5*B10
The constraints will be given by (putting the right-hand side {RHS} values in the adjacent cells):
Plant One   (B14)   =B9
Plant Two   (B15)   =2*B10
Plant Three (B16)   =3*B9+2*B10
Non-neg 1   (B17)   =B9
Non-neg 2   (B18)   =B10
You are now ready to use Solver.
On selecting the menu option Tools | Solver the dialogue box shown in Figure Two is revealed, and if you select the objective cell before invoking Solver the correct Target Cell will be identified. This is the value Solver will attempt either to maximise or minimise.
Figure 2
Figure Two: The Solver Dialogue Box
Select whether you wish to minimise or maximise the objective; in this case you would want to set the target cell (the objective) to Max. Note that you can use Solver to find the outcome that achieves a specified value for the target cell by clicking 'Value of:'. In doing this you can use Solver as a glorified goal seeker. Next you enter the range of cells you want Solver to vary, the decision variables. Click on the white box and select cells B9 and B10, or alternatively type them in. Note that you can try to get Solver to guess which cells you want to vary by clicking the 'Guess' button. If you have defined your problem in a logical way, Solver should usually get these right.
You can now enter the constraints by first clicking the 'Add ..' button. This reveals the dialogue box shown in Figure Three.
Figure 3
Figure Three: Entering Constraints
The cell reference is to the cell containing your constraint formula, so for the Plant One constraint you enter B14. By default <= is selected, but you can change this by clicking on the drop-down arrow to reveal a list of other constraint types. In the right-hand white box you enter the reference to the cell containing the RHS value, which for the Plant One constraint is cell C14. You then click 'Add' to add the rest of the constraints, remembering to include the non-negativity constraints.
Having added all the constraints, click 'OK' and the Solver dialogue box should look like that shown in Figure Four.
Figure 4
Figure Four: The Completed Solver Dialogue Box
Before clicking 'Solve', it is good practice when doing LPs to go into the Options and check the 'Assume Linear Model' box, unless, of course, your model isn't linear (Solver can handle most mathematical program types, including non-linear and integer problems). Doing this can reduce the time Solver takes to find a solution; it will also ensure the correct result and, quite importantly, provide the relevant sensitivity report. Having selected this option, you are now ready to click 'Solve' and watch Solver find the optimal values for doors and windows. While it runs, Excel reports Solver's progress at the bottom of the screen; on finding an optimal solution, the dialogue box shown in Figure Five will appear. You will also observe that Solver has altered all the values in your spreadsheet, replacing them with the optimal results.
You can use the Solver Results dialogue box to generate three reports. To select all three at once, either hold down CTRL and click each one in turn or drag the mouse over all three.
Figure 5
Figure Five: Solver Results
At the same time it's often a good idea to get Solver to restore your original values in the spreadsheet so that you can return to the original problem formulation and make adjustments to the model such as altering the availability of resources. The three reports are generated in new sheets in the current workbook of Excel.
The Answer Report gives details of the solution (in this case, profit is maximised at 36 when 2 batches of doors and 6 batches of windows are produced per week - not a particularly busy firm!), together with the status of each constraint and its accompanying slack/surplus value. The Sensitivity Report for the Wyndor problem, which shows how sensitive your solution is to changes in the constraints, is shown in Figure Six.
Figure 6
Figure Six: Sensitivity Report for Wyndor
As you can see from Figure Six, the report is fairly standard, providing information on shadow prices, reduced costs and the upper and lower limits for the decision variables and constraints. The Limits Report also provides sensitivity information on the RHS values. All the reports can simply be copied and pasted into Word, and this is perhaps one of the big advantages of using Excel over a DOS-based LP solver. Although the reports paste into Word as tables, they are easily converted into text and can then be manipulated if one is producing a written report on one's findings.
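The shadow prices in the Sensitivity Report are simply the dual values of the constraints, and they can be reproduced outside Excel. A hedged sketch using SciPy's HiGHS backend (SciPy is an assumption, not part of the article; the coefficients 3 and 5 are Hillier & Lieberman's originals):

```python
# Recover the shadow prices of the Wyndor plant capacities from the
# LP duals.  SciPy is an assumption; the article's numbers come from
# Excel's Sensitivity Report.
from scipy.optimize import linprog

c = [-3, -5]                          # minimise -z, i.e. maximise z
A_ub = [[1, 0], [0, 2], [3, 2]]       # Plants One, Two, Three
b_ub = [4, 12, 18]

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None)] * 2, method="highs")

# For a minimisation, marginals give d(objective)/d(RHS); negating
# them yields the shadow price of each plant's capacity for max z.
shadow = [-m for m in res.ineqlin.marginals]
print(shadow)   # Plant One has slack at the optimum, so its price is 0
```

An extra hour at Plant Two is worth 1.5 units of profit and at Plant Three 1 unit, while Plant One's capacity is not binding, which is exactly the interpretation the report invites.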
Finally, there are several options in Solver that allow you to amend or intervene in the solution-generating process. The 'Options' button in the Solver dialogue box reveals the dialogue box shown in Figure Seven. You can use this to control how accurate your solution is, how much 'effort' Solver puts into finding the solution and whether you want to see the results of each iteration.
Figure 7
Figure Seven: Solver Options
The Tolerance option is only required for integer programs (IP); it allows Solver to use 'near integer' values, within the tolerance you specify, which helps speed up the IP calculations. Checking the Show Iteration Results box allows you to see each step of the calculation, but be warned: if your model is complex this can take an inordinate length of time. Use Automatic Scaling is useful if there is a huge difference in magnitude between your decision variables and the objective value.
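Solver's Tolerance corresponds to what modern solvers call a relative MIP gap. To illustrate the idea, here is a hedged sketch using SciPy's milp routine (an assumption; the article uses only Excel), solving an integer version of the Wyndor problem with a 5% gap tolerance:

```python
# Integer version of the Wyndor problem with a 'near integer' tolerance,
# analogous to Solver's Tolerance option.  SciPy's milp is an assumption;
# the article itself works entirely in Excel Solver.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

c = np.array([-3.0, -5.0])                        # minimise -z
constraints = LinearConstraint([[1, 0], [0, 2], [3, 2]],
                               ub=[4, 12, 18])    # plant capacities
res = milp(c, constraints=constraints,
           integrality=np.ones(2),                # both variables integer
           bounds=Bounds(0, np.inf),              # non-negativity
           options={"mip_rel_gap": 0.05})         # accept a 5% gap

print(res.x, -res.fun)   # (2, 6) with profit 36, as in the LP
```

Because the LP relaxation of this particular problem is already integral, the tolerance has no effect here; on larger IPs it lets the solver stop early with a provably near-optimal answer.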
The bottom three options, Estimates, Derivatives and Search, affect the way Solver approaches finding a basic feasible solution, how it finds partial derivatives of the objective and constraints, and how it decides which way to search for the next iteration. Essentially these options affect how Solver uses memory and the number of calculations it makes. For most LP problems they are best left at their default values.
The 'Save Model' button is very useful, particularly if you save your model as a named scenario. Clicking this button allows you to assign a name to the current values of your variable cells. This option then allows you to perform further 'what-if' analysis on a variety of possible alternative outcomes - very useful for exploring your model in greater detail.
In conclusion, Excel Solver provides a simple, yet effective, medium for allowing users to explore linear programs. It can be used for large problems containing hundreds of variables and constraints, and solves these relatively quickly, but as a teaching tool using small illustrative problems it is very potent, particularly as the student must appreciate the structure of an LP when entering it into the spreadsheet.
On the downside, you can't view the tableau as it is generated at each iteration, so teachers who want their students to be proficient in the manual methods of LP will find Solver inferior to, say, LINDO, which does allow this. It does, however, produce a superior set of results and sensitivity reports compared with LINDO, and, thanks to the spreadsheet format, allows the student very quickly to observe the effects of any changes made to the constraints or the objective function.
This is particularly noticeable as the model formulation is easily accessible at the same time as the model results, these being simply placed in adjacent worksheets, accessible with a simple mouse click. Overall, using Excel, which is familiar to a large number of students, provides a rich environment for teaching linear programming and it allows students to explore their models in a structured, yet flexible, way.

References

  1. George B. Dantzig (1963) "Linear Programming and Extensions", Princeton University Press, Princeton, N.J.
  2. Frederick Hillier & Gerald Lieberman (1995) "Introduction to Operations Research", sixth edition, McGraw-Hill
  3. The Cobb Group (1994) "Running Excel 5 for Windows", fourth edition, Microsoft Press
The author may be contacted at the following address:
Ziggy MacDonald, Department of Economics, University of Leicester, University Road, Leicester LE1 7RH, United Kingdom. Telephone +44 (0)116 252 2894, Fax +44 (0) 252 2908, e-mail abm1@le.ac.uk

Friday, July 9, 2010

Multiplying DNA One Drop at a Time

RainDance Technologies says its method of amplifying DNA in drops of water will expand clinical genetic testing.

As the cost of DNA sequencing continues to fall and scientists discover a growing number of genes linked to different diseases, the field of genetic diagnostics is preparing for a boom. Rather than the single-gene tests common today, clinical genetics laboratories are developing tests that simultaneously detect tens or even hundreds of genetic mutations linked to cancer and other diseases, as well as conditions such as mental retardation.
However, with these more complex tests, diagnostic developers need to be able to select specific portions of the genome for analysis efficiently and accurately. "Now the cost of sequencing is so cheap you don't have to look at just one or two genes, you can look much more broadly," says Alexis Borisy, entrepreneur in residence at Third Rock Ventures and acting chief executive of Foundation Medicine, a Cambridge, MA-based startup developing genetic tests for analyzing cancer. "But the whole genome or exome [the portion of the genome that codes for proteins] is still too expensive to be clinically useful, so you have to focus the search."
RainDance, a startup based in Lexington, MA, aims to fill that gap with its droplet-based microfluidics technology. Founded in 2004, the company uses picoliter-sized droplets as tiny test tubes to carry out chemical reactions at very small volumes. Precisely sized droplets are created on a microfluidics chip by surrounding aqueous liquid with small volumes of oil. The droplets, generated at a rate of 10 million per hour, can be tightly packed and injected with different reagents, including strands of DNA. To catalyze a reaction, an electrical signal triggers droplets containing different reagents to merge.

Tuesday, June 8, 2010

The best way to discharge a mobile phone battery

is to leave it in the phone until the phone switches off.


Nickel Cadmium (NiCd)



All the foregoing applies to NiCd batteries. To get the best life out of a NiCd, let it run down every second or third charge. Do it more often and you shorten its overall life: do it less often, and you risk reducing its charge capacity.

Nickel Metal Hydride (NiMH)



NiMH batteries need much the same care as NiCd, except that you only need to run them down every week or two, if they are charged every night.

Lithium Ion (Li-Ion)



Lithium Ion batteries are very different. You should not deliberately discharge a Li-Ion cell. In fact, if you were to manage to run one flat, it would probably be damaged. There are electronics inside each Li-Ion battery to protect it from such abuse, but don't take the risk!



To keep your Li-Ion battery in good shape, simply charge it overnight before it runs down. If a full battery at all times matters to you, you can top it up whenever you like, but you'll probably get a longer service life from it if you only recharge it when it is getting a bit low.

Storage



Batteries of any type don't like to be left discharged. In general, if you have a spare battery, it is probably best to use it alternately with its partner.

Declining years



Age and infirmity come to all of us, but mobile phone batteries get there quicker than their users!

NiCd



A NiCd battery will lose its charge capacity, and may run flat on its own. This is often caused by sharp, spiky crystals growing through the separators of the cell, causing a short circuit. It is possible to "flash" these away by applying a very high current (such as from a large battery) for a short while. The current through the spike will melt it away, curing the short circuit, but that's not really a cure: the hole in the insulator will still be there, and there will probably be other crystals poised to do the same in another place. If your NiCd battery has managed 700 or more charge cycles, or has been exposed to excessive heat or other abuse, replace it!

NiMH



A tired NiMH battery will probably give good standby times, but as soon as you make or receive a call, you'll discover that it can't provide the current needed. This is because age and heat cause the crystals inside the cell to get bigger, which means that their surface area falls in proportion to their volume. Unfortunately, there is nothing much you can do about this. If a NiMH battery has managed 500 or more charge cycles, it has done well. Time for a replacement!



Li-Ion



Li-Ion batteries can fail suddenly, possibly because the electronics inside them have gone wrong, but in general they simply fade away. Because the capacity falls gradually over the charge-cycle life, when to replace one is a matter of when the charge capacity is no longer sufficient for your needs. Never try to revitalise a Li-Ion battery in any way, or expose it to excessive heat: the very high power density of Li-Ion makes such actions very dangerous.

Replacement



Because of the subsidy system, it is often cheaper to upgrade to a new model of phone (complete with new battery) than it is to buy a new battery. Having said that, it really is worth replacing a worn-out battery. It is common for people to remark that they wish they'd bought a new battery sooner - putting it off is rarely wise!

Disposal



When it is time to say good bye to an old, tired, battery, don't throw it on a fire: it could explode. Don't put it in your dustbin: there should be facilities for recycling rechargeable batteries provided by your local council.

NTT DoCoMo begins testing LTE network

Japan's NTT DoCoMo on Tuesday began testing a new cellular data network that should ensure Tokyo remains one of the fastest places on the planet to send and receive data via cell phone.




The new network is scheduled to go into operation in December this year and should initially deliver upload speeds of up to 25Mbps (bits per second) and download speeds of up to 75Mbps. These are respectively 5 and 10 times faster than NTT DoCoMo's current fastest service.



The new network is based on a technology called LTE (Long-Term Evolution), an IP-based system seen as a replacement for 3G-based HSPA (High Speed Packet Access). Its introduction will not only mean faster data transfers but could also reduce the per-byte cost of data communications.



NTT DoCoMo began building the LTE network in December last year and the tests that began on Tuesday will verify the network for speed, latency, stability of inter-cell handover and other factors important to a commercial service.



The December launch will be for a data communications service and DoCoMo plans to begin selling its first LTE-compatible handsets in 2011. Initial service will be restricted to Tokyo but 50 percent of populated areas are expected to be covered by 2014.



The network is being built with an investment of between ¥300 billion and ¥400 billion (US$3.3 billion to $4.4 billion) during the first five years of the roll-out, and comes with an eye on the future.



Eventually NTT DoCoMo expects to offer even faster speeds via the new network. Future upgrades will push download speeds as fast as 300Mbps and upload speeds to 75Mbps.

Sunday, May 30, 2010

NPT session approves steps on nuclear-free Middle East

United Nations: A landmark conference to curb the spread of nuclear weapons agreed here Friday on talks toward the establishment of a nuclear weapon-free zone in the Middle East. It was the first agreement in a decade on the nuclear Non-Proliferation Treaty (NPT), which since 1970 has set the global agenda for keeping countries from getting the...

Saturday, May 29, 2010

July Workshop Devoted to Improving Usability of Health Care IT



For Immediate Release: May 25, 2010



Contact: Ben Stein

301-975-3097





Improving the ease of use of information systems for the health care industry could significantly facilitate the adoption of technology that has great potential to improve the quality of health care while reducing costs. The goal is to allow medical professionals to interact with health care information technology quickly and easily, so that it supports their primary tasks rather than complicating them. Along the way, steps must be taken to ensure that health IT systems are accessible to people with disabilities.



Toward these ends, the National Institute of Standards and Technology (NIST) will host a one-day workshop on improving the usability—ease of use—of health IT. Co-sponsored by the Department of Health and Human Services Office of the National Coordinator and the Agency for Healthcare Research and Quality, the workshop will take place on July 13, 2010, at the NIST Gaithersburg campus.



The goal of the workshop is to promote collaboration in health IT usability among federal agencies, industry, academia and others. Attendees will discuss ways to prioritize, align and coordinate short-, medium-, and long-term strategies and tactics to improve the usability of electronic health records (EHRs). Specific objectives of the workshop will be to establish an immediate-term set of actions to inform the national initiative to drive the adoption and meaningful use of EHRs; develop a strategic approach to measure and assess the use of EHRs and the impact of usability on their adoption and innovation; develop strategies to drive best practices and innovation in health care IT; and inspire follow-on activities in health care IT usability.

Friday, May 28, 2010

PCI tokenization guidance could benefit payment processors

In a recent interview, Bob Russo, general manager of the PCI SSC, said he didn't expect any major changes to the data security standard (PCI DSS), which is undergoing a revision this year. But guidance documents are being developed to help merchants decide whether investing in encryption or PCI tokenization technologies is a wise move.



"We're creating a framework right now where we map these technologies out and lay them next to the standards, so if somebody is using one of these technologies, [the framework] will let them know if they would satisfy certain requirements," Russo said.



Some payment processors and encryption vendors have rallied around a mixture of encryption and tokenization software to protect card data at the moment a customer swipes a credit card at a payment terminal. RSA, the security division of EMC Corp., is working with First Data to provide tokenization technology in the encryption services that First Data sells to merchants. Voltage Security Inc. sells an encryption product combined with tokenization and is working closely with Heartland Payment Systems to provide encryption and tokenization services.

Thursday, May 27, 2010

Time to kill the private cloud?

Amazon.com CTO weighs in on vendor marketing.

Public cloud compute pioneer Amazon.com has slammed the marketing campaigns of the world's largest hardware and software vendors over the use of the phrase "private cloud."



"Private cloud" is used by a variety of vendors to describe virtualised hardware, software and networking stacks that are able to offer an enterprise a consolidated pool of server compute and storage in a "utility-like" fashion.



The "public cloud", by contrast, sees these same IT resources delivered over the internet in an "on-demand" elastic fashion by large external service providers and sold on a "pay as you go" pricing model.



IT equipment labelled "private cloud" is often sold on the basis that organisations may choose to operate in a "hybrid" model in future - choosing which applications and services are hosted in-house and easily pushing others to public clouds as the need arises.



But Amazon Web Services, which pioneered infrastructure-as-a-service (via its EC2 product) and storage-as-a-service (via its S3 product), has had a gutful of the terminology.



Amazon.com chief technology officer Dr Werner Vogels [pictured] told delegates at CeBIT this week that even analyst group Gartner's definition of cloud computing is flawed, as it omits important considerations such as "on demand" and "pay as you go pricing".



Dr Vogels compared buying your own "private cloud" computing equipment with relying exclusively on your own diesel generators instead of connecting to the electricity grid.



Friday, May 21, 2010

Google rolls out IT-friendly Android OS upgrade

Google detailed on Thursday its Android 2.2 OS, featuring enterprise-level enhancements as well as a speed boost and Flash support.
Codenamed "Froyo," for frozen yogurt, Android 2.2 includes more than 20 new features geared to enterprises, said Google's Vic Gundotra, vice president of engineering. Among these is integration with the Microsoft Exchange messaging system, with such capabilities as account auto-discovery and linkage with the Exchange global address book. Calendar synchronization is offered as well.
[ InfoWorld's Paul Krill reported that HTML5 also is getting a lot of attention at the I/O conference. | Stay up on tech news and reviews from your smartphone at infoworldmobile.com. | Keep up on key mobile developments and insights with the Mobile Edge blog and Mobilize newsletter. ]
 "Number 1, we've become Microsoft Exchange-friendly," Gundotra said in introducing Android 2.2 at the Google I/O conference in San Francisco.
Also in the enterprise space, APIs are featured for device policy management, enabling developers to write applications that control security features like remote wipe, minimum password, and lockscreen timeout, according to Google.
"As Android adoption [has] skyrocketed, people have been taking these devices to work," Gundotra said.
Froyo offers an application data backup API and a cloud-to-device messaging API. Devices running Android 2.2 also can serve as a portable hotspot for network access.
The OS upgrade features a two-to-five-times speed improvement for applications via use of a just-in-time compiler functioning with the Dalvik virtual machine.  
Android 2.2 will be made available to equipment manufacturers and the open source community in coming weeks; developers can download the Android SDK and NDK (native development kit) from the Android developer site.
Android 2.2 supports Flash 10.1 in the browser and Adobe AIR (Adobe Integrated Runtime). "It turns out that people actually use Flash," Gundotra said, in an obvious swipe at Apple's refusal to allow Flash on devices such as the iPhone and iPad. Adobe, meanwhile, has just released the public beta version of Flash Player 10.1 for Android, an Adobe representative said.
Browser capabilities in Android 2.2 are being enhanced with inclusion of the V8 JavaScript engine now featured in the Google Chrome browser.
"It's critically important for us to make the Android browser rock, and we're going to constantly improve that browser. Froyo is a major step in that direction," Gundotra said.
"We can claim Froyo has the world's fastest mobile browser," he said.
Users also will be able to access Android camera capabilities via the browser. Voice input is again featured as well, for informational queries.
"We're going to make it very simple to use voice input," as a way to interact with an Android device, said Gundotra.
Gundotra said there are now 100,000 Android activations daily.
Novell, for its part, is announcing MonoDroid, a software development kit for building Android applications using code and libraries written for the Microsoft .Net development framework and languages like C#. MonoDroid functions with the Android SDK.
In the consumer vein, Google Thursday morning also announced Google TV, an effort to integrate Web browsing  capabilities into TV sets.
This article, "Google rolls out Android OS upgrade," was originally published at InfoWorld.com. Follow the latest developments in business technology news and get a digest of the key stories each day in the InfoWorld Daily newsletter and on your mobile device at infoworldmobile.com.

Thursday, May 20, 2010

One in four households now has only mobile phones

One in four U.S. households now has only wireless telephone service, a recent U.S. government study has found.

The report also drew some interesting correlations with the health of wireless users: adults in wireless-only homes reported binge drinking at nearly twice the rate of adults in homes with landlines.

Mobile phone-only adults were also more likely to be current smokers and more likely to experience "serious psychological distress," said the report from the Centers for Disease Control and Prevention , released May 12.

The CDC did not say wireless phones caused any of the health problems it noted, but it documented that wireless-only homes were more heavily populated by younger people and unrelated adults, groups generally exposed to more drinking and smoking.

The percentage of homes with only mobile phones, 24.5% in the last half of 2009, represented an increase of 1.8 percentage points since the first half of 2009, the survey of more than 21,000 households found.

The steady increase in wireless-only homes has been reported by the CDC since 2003, when about 3% of homes in the U.S. were wireless.

Carl Howe, an analyst at Yankee Group, said in a blog post that the growth of wireless-only homes was "an amazing statistic."

However, he and other analysts are keenly aware of ways telecom providers in the U.S. have been gradually converting networks and business plans toward wireless.

Yankee Group also said that the national average of wireless-only homes understates results that it has found in surveys of 14,000 consumers, conducted in 2009. At that time, it found wireless-only homes exceeded 28%. In states such as Arkansas, North Carolina and Ohio, more than 40% of the homes had only wireless service, Howe said.

In some rural states where wired telecommunications are expensive or complicated to install, a majority of homes have cut their landlines -- at least based on survey results using a sample size of less than 50 people, Yankee said. Those states include Idaho, Wyoming and North Dakota.

The CDC didn't say that wireless phones caused binge drinking or smoking or psychological distress and merely reported a correlation. Binge drinking was reported in 34.5% of wireless-only homes, compared with 18.7% of homes with wired phones.

The CDC didn't report how much more likely wireless-only homes had smokers or those with psychological problems.

In another correlation, the CDC discovered that wireless phones were used widely in homes occupied by unrelated adult roommates, confirming the prevalence of wireless by young people and college residents seen by wireless carriers and college administrators.

The CDC said nearly two-thirds (62.9%) of adults living only with unrelated adults were also in homes that were wireless-only.

Forty-three percent of renters had wireless phones only, and 49% of adults aged 25 to 29 had only wireless phones in their homes.

For groups aged between 18 and 24, and between 30 and 34, about 37% lived in wireless-only homes. Men were slightly more likely than women (by 25% to 21%) to live in homes with only wireless phones.

Tuesday, May 18, 2010

Microsoft Security - Six Years Later

On January 15, 2002, Bill Gates sent an email to every full-time employee at Microsoft in which he described the company’s new strategy emphasizing security in its products. In the email Gates referred to the new philosophy as “Trustworthy Computing” and called it the “highest priority”.




The Computerworld posting Microsoft Can’t Claim Victory in Security Battle picks up the story.



As Gates officially retires from his job at Microsoft, he leaves behind a company that by most accounts is doing better on security. But fully convincing users of that is an elusive goal. And increasing competition from Web 2.0 and software-as-a-service (SaaS) vendors is posing new challenges for the security development model implemented after Gates wrote his memo.



There is general agreement that bugs are inevitable and that Microsoft’s massive user base makes it a big target for attackers. But the steady drumbeat of patch releases has tarnished the company’s efforts to improve its security standing, …



The original blog posting “Trustworthy Computing” - Yea, Right, Sure was posted soon after the “Trustworthy Computing” memo hit the Web. It has been updated since.



Yea, right, sure, Bill. Sending Microsoft coders off to security and reliability coding school is going to make things all better real soon. If you believe that, I have a bridge to sell you. Anyone sent off to training comes back knowing some new buzzwords and maybe even understanding a couple of new concepts.



I applaud the effort, but it takes a very long time to break old coding habits and internalize new ones, no matter what the punishments and rewards are. No one comes back cleansed of old habits. I’m reminded of the old saying that you can train a dog but you can’t make it think.



I think the problem facing Microsoft is systemic. In my opinion, poorly designed code and poor coding practices may be at the heart of the Microsoft security and stability epidemic. Detecting and eradicating them may be impossible.



If it could be done, the effort may cost many times that of developing and testing the product line in the first place. Automated tools will help pick off the very low hanging fruit, but won’t get anywhere near the really nasty problems that seem to exist throughout Microsoft’s product line.



Bill Gates seems to have made choices about security and reliability early on. There’s no practical way to rectify them now, except maybe by starting from scratch.



Even starting over with Vista won’t fix the problem. The real culprit may be Microsoft’s corporate culture created by Bill Gates. Getting a culture’s head straight is a very difficult, if not an impossible task.



In my opinion, the fundamental problem facing Microsoft isn’t a technology one but a human one. I don’t think any amount of training or engineering will fix it.



Besides corporate culture, I don’t think starting over is a likely option for Microsoft, as I discussed in the Obese Windows blog posting.



Microsoft security issues are getting better. I don’t foresee them improving to the state of common contemporary operating systems such as Mac OS X, Linux, Solaris, HP-UX, AIX, FreeBSD, OpenBSD, etc.



I also don’t expect to see the company culture change radically. Bill Gates may have left the building, but he is still Chairman of the Board and the company’s largest shareholder.