Wednesday, July 28, 2010

Teaching Linear Programming using Microsoft Excel Solver

Ziggy MacDonald
University of Leicester
Linear programming (LP) is one of the most widely applied O.R. techniques and owes its popularity principally to George Dantzig's simplex method (Dantzig 1963) and the revolution in computing. It is a very powerful technique for solving allocation problems and has become a standard tool for many businesses and organisations. Although Dantzig's simplex method allows solutions to be generated by hand, the iterative nature of producing solutions is so tedious that, had the computer never been invented, linear programming would have remained an interesting academic idea, relegated to the mathematics classroom. Fortunately, computers were invented, and as they have become so powerful for so little cost, linear programming has become possibly one of the most widespread uses for a personal computer.
There are of course numerous software packages dedicated to solving linear programs (and other types of mathematical program), of which LINDO, GAMS and XPRESS-MP are possibly the most popular. All these packages tend to be DOS-based and are intended for a specialist market which requires tools dedicated to solving LPs. In recent years, however, several standard business packages, such as spreadsheets, have started to include an LP solving option, and Microsoft Excel is no exception. The inclusion of an LP solving capability in applications such as Excel is attractive for at least two reasons. First, Excel is perhaps the most popular spreadsheet used both in business and in universities and as such is very accessible. Second, the spreadsheet offers very convenient data entry and editing features which allow the student to gain a greater understanding of how to construct linear programs.
To use Excel to solve LP problems the Solver add-in must be included. Typically this feature is not installed by default when Excel is first set up on your hard disk. To add this facility to your Tools menu you need to carry out the following steps (once only):
  1. Select the menu option Tools | Add-Ins (this will take a few moments to load the necessary file).
  2. From the dialogue box presented check the box for Solver Add-In.
  3. On clicking OK, you will then be able to access the Solver option from the new menu option Tools | Solver (which appears below Tools | Scenarios ...)
To illustrate Excel Solver I will consider Hillier & Lieberman's reasonably well known example, the Wyndor Glass Co. problem (Hillier & Lieberman, 1995). The problem concerns a glass manufacturer which uses three production plants to assemble its products, mainly glass doors (x1) and wooden frame windows (x2). Each product requires different times in the three plants and there are certain restrictions on available production time at each plant. With this information and a knowledge of contributions to profit of the two products the management of the company wish to determine what quantities of each product they should be producing in order to maximise profits. In other words, the Wyndor Glass Co. problem is a classic, albeit very simple, product-mix problem.
The problem is formulated as the following linear program:
max z = 3x1 + 5x2   (objective)

     subject to       x1 <= 4  (Plant One)
                     2x2 <= 12 (Plant Two)
               3x1 + 2x2 <= 18 (Plant Three)

                  x1, x2 >=  0  (Non-negativity requirements)

     where          z = total profit per week
               x1 = number of batches of doors produced per week
               x2 = number of batches of windows produced per week.
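The article solves this LP with Excel, but a problem this small can also be cross-checked in a few lines of Python by enumerating the corner points of the feasible region (the simplex method's optimum always lies at a vertex). This is an illustrative sketch, not the method Solver uses, and it assumes the Hillier & Lieberman profit coefficients of 3 per batch of doors and 5 per batch of windows:

```python
from itertools import combinations

# Wyndor constraints in the form a1*x1 + a2*x2 <= b,
# with non-negativity written the same way (-x <= 0).
constraints = [
    (1, 0, 4),    # Plant One:   x1        <= 4
    (0, 2, 12),   # Plant Two:         2x2 <= 12
    (3, 2, 18),   # Plant Three: 3x1 + 2x2 <= 18
    (-1, 0, 0),   # x1 >= 0
    (0, -1, 0),   # x2 >= 0
]

def intersect(c1, c2):
    """Solve the 2x2 system where both constraints hold with equality."""
    a1, b1, r1 = c1
    a2, b2, r2 = c2
    det = a1 * b2 - a2 * b1
    if det == 0:
        return None  # parallel lines never intersect
    return ((r1 * b2 - r2 * b1) / det, (a1 * r2 - a2 * r1) / det)

def feasible(pt):
    return all(a * pt[0] + b * pt[1] <= r + 1e-9 for a, b, r in constraints)

# Candidate vertices: feasible intersections of constraint pairs.
vertices = {p for c1, c2 in combinations(constraints, 2)
            if (p := intersect(c1, c2)) and feasible(p)}

# The optimum maximises z = 3*x1 + 5*x2 over the vertices.
best = max(vertices, key=lambda p: 3 * p[0] + 5 * p[1])
print(best, 3 * best[0] + 5 * best[1])   # (2.0, 6.0) 36.0
```

This reproduces the optimal product mix reported later in the article: 2 batches of doors and 6 batches of windows, for a weekly profit of 36.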
Having formulated the problem (yours may have substantially more decision variables and constraints), you can proceed to enter it into Excel. The best approach is first to list in a column the names of the objective function, decision variables and constraints. You can then enter some arbitrary starting values in the cells for the decision variables, usually zero, as shown in Figure One. Excel will vary the values of these cells as it determines the optimal solution. Having assigned the decision variables some arbitrary starting values, you can use those cell references explicitly when writing the formulae for the objective function and constraints, remembering to start each formula with an '='.
Figure 1
Figure One: Setting up the problem in Excel
Entering the formulae for the objective and constraints, the objective function in B5 will be given by:
=3*B9+5*B10
The constraints will be given by (putting the right hand side {RHS} values in the adjacent cells):
Plant One (B14)      =B9
Plant Two (B15)      =2*B10
Plant Three (B16)    =3*B9+2*B10
Non-neg 1 (B17)      =B9
Non-neg 2 (B18)      =B10
You are now ready to use Solver.
On selecting the menu option Tools | Solver the dialogue box shown in Figure Two is revealed, and if you select the objective cell before invoking Solver the correct Target Cell will be identified. This is the value Solver will attempt either to maximise or minimise.
Figure 2
Figure Two: The Solver Dialogue Box
Select whether you wish to maximise or minimise the objective; in this case you would set the target cell (the objective) to Max. Note that you can also use Solver to find the outcome that achieves a specified value for the target cell by clicking 'Value of:'; in doing this you can use Solver as a glorified goal-seeker. Next you enter the range of cells you want Solver to vary, i.e. the decision variables. Click on the white box and select cells B9 and B10, or alternatively type them in. Note that you can ask Solver to guess which cells you want to vary by clicking the 'Guess' button; if you have defined your problem in a logical way, Solver will usually get these right.
You can now enter the constraints by first clicking the 'Add ..' button. This reveals the dialogue box shown in Figure Three.
Figure 3
Figure Three: Entering Constraints
The cell reference is to the cell containing your constraint formula, so for the Plant One constraint you enter B14. By default <= is selected but you can change this by clicking on the drop down arrow to reveal a list of other constraint types. In the right hand white box you enter the cell reference to the cell containing the RHS value, which for the Plant One constraint is cell C14. You then click 'Add' to add the rest of the constraints, remembering to include the non-negativity constraints.
Having added all the constraints, click 'OK' and the Solver dialogue box should look like that shown in Figure Four.
Figure 4
Figure Four: The Completed Solver Dialogue Box
Before clicking 'Solve' it is good practice when solving LPs to go into the Options and check the 'Assume Linear Model' box, unless, of course, your model isn't linear (Solver can handle most mathematical program types, including non-linear and integer problems). Doing this reduces the time Solver takes to find a solution, ensures the correct result for a linear model and, quite importantly, provides the relevant sensitivity report. Having selected this option you are ready to click 'Solve' and watch Solver find the optimal values for doors and windows. While it works, Excel reports Solver's progress at the bottom of the screen; on finding an optimal solution, the dialogue box shown in Figure Five will appear. You will also observe that Solver has altered the values in your spreadsheet, replacing them with the optimal results.
You can use the Solver Results dialogue box to generate three reports. To select all three at once, either hold down CTRL and click each one in turn or drag the mouse over all three.
Figure 5
Figure Five: Solver Results
At the same time it's often a good idea to get Solver to restore your original values in the spreadsheet so that you can return to the original problem formulation and make adjustments to the model such as altering the availability of resources. The three reports are generated in new sheets in the current workbook of Excel.
The Answer Report gives details of the solution (in this case, profit is maximised at 36 when 2 batches of doors and 6 batches of windows are produced per week - not a particularly busy firm!) together with the status of each constraint and its slack/surplus value. The Sensitivity Report for the Wyndor problem, which shows how sensitive your solution is to changes in the constraints, is shown in Figure Six.
Figure 6
Figure Six: Sensitivity Report for Wyndor
As you can see from Figure Six, the report is fairly standard, providing information on shadow prices, reduced costs and the upper and lower limits for the decision variables and constraints. The Limits Report also provides sensitivity information on the RHS values. All the reports can simply be copied and pasted into Word, and this is perhaps one of the big advantages of using Excel over a DOS-based LP solver. Although the reports paste into Word as tables, they are easily converted into text and can then be manipulated if you are producing a written report on your findings.
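For a problem as small as Wyndor, the shadow prices in the Sensitivity Report can also be verified by hand via complementary slackness. This is a sketch assuming the standard Hillier & Lieberman coefficients (objective 3x1 + 5x2, optimum at (2, 6)); the values it prints should match the classic Wyndor sensitivity figures:

```python
# At the optimum (x1, x2) = (2, 6), Plant One has slack (x1 = 2 < 4),
# so its shadow price y1 is zero. Plants Two and Three are binding, and
# the dual (shadow price) conditions for the two variables give:
#   x1:  1*y1 + 3*y3 = 3
#   x2:  2*y2 + 2*y3 = 5
y1 = 0.0                # non-binding constraint: zero shadow price
y3 = (3 - y1) / 3       # from the x1 condition
y2 = (5 - 2 * y3) / 2   # from the x2 condition
print(y1, y2, y3)       # 0.0 1.5 1.0
```

Interpreted as in the Sensitivity Report: an extra hour at Plant One is worth nothing (it already has slack), while an extra hour at Plant Two or Plant Three would raise weekly profit by 1.5 and 1 respectively.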
Finally, there are several options in Solver that allow you to amend or intervene in the solution-generating process. The 'Options' button in the Solver dialogue box reveals the dialogue box shown in Figure Seven. You can use this to control how accurate your solution is, how much 'effort' Solver puts into finding the solution, and whether you want to see the results of each iteration.
Figure 7
Figure Seven: Solver Options
The Tolerance option is only required for integer programs (IP); it allows Solver to use 'near integer' values, within the tolerance you specify, which helps speed up the IP calculations. Checking the Show Iteration Results box allows you to see each step of the calculation, but be warned: if your model is complex this can take an inordinate length of time. The Use Automatic Scaling option is useful if there is a huge difference in magnitude between your decision variables and the objective value.
The bottom three options, Estimates, Derivatives and Search, affect the way Solver finds a basic feasible solution, how it computes partial derivatives of the objective and constraints, and how it decides which direction to search for the next iteration. Essentially these options trade off memory use against the number of calculations Solver makes. For most LP problems they are best left at their default values.
The 'Save Model' button is very useful, particularly if you save your model as a named scenario. Clicking this button allows you to assign a name to the current values of your variable cells. This option then allows you to perform further 'what-if' analysis on a variety of possible alternative outcomes - very useful for exploring your model in greater detail.
In conclusion, Excel Solver provides a simple yet effective medium for allowing users to explore linear programs. It can handle large problems containing hundreds of variables and constraints, and solves them relatively quickly, but as a teaching tool using small illustrative problems it is very potent, particularly as the student must appreciate the structure of an LP when entering it into the spreadsheet.
On the downside, you cannot view the tableau as it is generated at each iteration, so teachers who want their students to be proficient in the manual methods of LP will find Solver inferior to, say, Lindo, which does allow this. It does, however, produce a superior set of results and sensitivity reports compared to Lindo, and, because of its spreadsheet nature, allows the student very quickly to observe the effects of any changes made to the constraints or the objective function.
This is particularly noticeable as the model formulation is easily accessible at the same time as the model results, these being simply placed in adjacent worksheets, accessible with a simple mouse click. Overall, using Excel, which is familiar to a large number of students, provides a rich environment for teaching linear programming and it allows students to explore their models in a structured, yet flexible, way.

References

  1. George B. Dantzig (1963) "Linear Programming and Extensions", Princeton University Press, Princeton, N.J.
  2. Frederick Hillier & Gerald Lieberman (1995) "Introduction to Operations Research", sixth edition, McGraw-Hill
  3. The Cobb Group (1994) "Running Excel 5 for Windows", fourth edition, Microsoft Press
The author may be contacted at the following address:
Ziggy MacDonald, Department of Economics, University of Leicester, University Road, Leicester LE1 7RH, United Kingdom. Telephone +44 (0)116 252 2894, Fax +44 (0) 252 2908, e-mail abm1@le.ac.uk

Friday, July 9, 2010

Multiplying DNA One Drop at a Time

RainDance Technologies says its method of amplifying DNA in drops of water will expand clinical genetic testing.

As the cost of DNA sequencing continues to fall and scientists discover a growing number of genes linked to different diseases, the field of genetic diagnostics is preparing for a boom. Rather than the single-gene tests common today, clinical genetics laboratories are developing tests that simultaneously detect tens or even hundreds of genetic mutations linked to cancer and other diseases, as well as conditions such as mental retardation.
However, with these more complex tests, diagnostic developers need to be able to select specific portions of the genome efficiently and accurately for analysis. "Now the cost of sequencing is so cheap you don't have to look at just one or two genes, you can look much more broadly," says Alexis Borisy, entrepreneur in residence at Third Rock Ventures and acting chief executive of Foundation Medicine, a Cambridge, MA-based startup developing genetic tests for analyzing cancer. "But the whole genome or exome [the portion of the genome that codes for proteins] is still too expensive to be clinically useful, so you have to focus the search."
RainDance, a startup based in Lexington, MA, aims to fill that gap with its droplet-based microfluidics technology. Founded in 2004, the company uses picoliter-sized droplets as tiny test tubes to carry out chemical reactions at very small volumes. Precisely sized droplets are created on a microfluidics chip by surrounding aqueous liquid with small volumes of oil. The droplets, generated at a rate of 10 million per hour, can be tightly packed and injected with different reagents, including strands of DNA. To catalyze a reaction, an electrical signal triggers droplets containing different reagents to merge.

Tuesday, June 8, 2010

The best way to discharge a mobile phone battery

The best way is to leave the battery in the phone until the phone switches itself off.


Nickel Cadmium (NiCd)



All the foregoing applies to NiCd batteries. To get the best life out of a NiCd, let it run down every second or third charge. Do it more often and you shorten its overall life: do it less often, and you risk reducing its charge capacity.

Nickel Metal Hydride (NiMH)



NiMH batteries need much the same care as NiCd, except that you only need to run them down every week or two, if they are charged every night.

Lithium Ion (Li-Ion)



Lithium Ion batteries are very different. You should not deliberately discharge a Li-Ion cell; in fact, if you were to manage to run one flat, it would probably be damaged. There are electronics inside each Li-Ion battery to protect it from such abuse, but don't take the risk!



To keep your Li-Ion battery in good shape, simply charge it overnight before it runs down. If a full battery at all times matters to you, you can top it up whenever you like, but you'll probably get a longer service life from it if you only recharge it when it is getting a bit low.

Storage



Batteries of any type don't like to be left discharged. In general, if you have a spare battery, it is probably best to use it alternately with its partner.

Declining years



Age and infirmity come to all of us, but mobile phone batteries get there quicker than their users!

NiCd



A NiCd battery will lose its charge capacity, and may run flat on its own. This is often caused by sharp, spiky crystals growing through the separators of the cell, causing a short circuit. It is possible to "flash" these away by applying a very high current (such as from a large battery) for a short while. The current through the spike will melt it away, curing the short circuit, but that's not really a cure: the hole in the insulator will still be there, and there will probably be other crystals poised to do the same in another place. If your NiCd battery has managed 700 or more charge cycles, or has been exposed to excessive heat or other abuse, replace it!

NiMH



A tired NiMH battery will probably give good standby times, but as soon as you make or receive a call, you'll discover that it can't provide the current needed. This is because age and heat cause the crystals inside the cell to get bigger, which means that their surface area falls in proportion to their volume. Unfortunately, there is nothing much you can do about this. If a NiMH battery has managed 500 or more charge cycles, it has done well. Time for a replacement!



Li-Ion



Li-Ion batteries can fail suddenly, possibly because the electronics inside have gone wrong, but in general they simply fade away. Because the capacity falls gradually over the charge-cycle life, when to replace one is a matter of when its charge capacity is no longer sufficient for your needs. Never try to revitalise a Li-Ion battery in any way, or expose it to excessive heat: the very high power density of Li-Ion makes such actions very dangerous.

Replacement



Because of the subsidy system, it is often cheaper to upgrade to a new model of phone (complete with new battery) than it is to buy a new battery. Having said that, it really is worth replacing a worn-out battery. It is common for people to remark that they wish they'd bought a new battery sooner - putting it off is rarely wise!

Disposal



When it is time to say goodbye to an old, tired battery, don't throw it on a fire: it could explode. Don't put it in your dustbin either: your local council should provide facilities for recycling rechargeable batteries.

NTT DoCoMo begins testing LTE network

Japan's NTT DoCoMo on Tuesday began testing a new cellular data network that should ensure Tokyo remains one of the fastest places on the planet to send and receive data via cell phone.




The new network is scheduled to go into operation in December this year and should initially deliver upload speeds of up to 25Mbps (bits per second) and download speeds of up to 75Mbps. The speeds are respectively 5 and 10 times faster than NTT DoCoMo's current fastest service.



The new network is based on a technology called LTE (Long-Term Evolution), an IP-based system seen as a replacement for 3G-based HSPA (High Speed Packet Access). Its introduction will not only mean faster data transfers but could also reduce the per-byte cost of data communications.



NTT DoCoMo began building the LTE network in December last year and the tests that began on Tuesday will verify the network for speed, latency, stability of inter-cell handover and other factors important to a commercial service.



The December launch will be for a data communications service and DoCoMo plans to begin selling its first LTE-compatible handsets in 2011. Initial service will be restricted to Tokyo but 50 percent of populated areas are expected to be covered by 2014.



The network is being built with an investment of between ¥300 billion and ¥400 billion (US$3.3 billion to $4.4 billion) during the first five years of the roll-out, and comes with an eye on the future.



Eventually NTT DoCoMo expects to offer even faster speeds via the new network. Future upgrades will push download speeds as fast as 300Mbps and upload speeds to 75Mbps.

Sunday, May 30, 2010

NPT session approves steps on nuclear-free Middle East

United Nations: A landmark conference to curb the spread of nuclear weapons agreed here Friday on talks toward the establishment of a nuclear weapon-free zone in the Middle East. It was the first agreement in a decade on the nuclear Non-Proliferation Treaty (NPT), which since 1970 has set the global agenda for keeping countries from getting the...

Saturday, May 29, 2010

July Workshop Devoted to Improving Usability of Health Care IT



For Immediate Release: May 25, 2010



Contact: Ben Stein

301-975-3097





Improving the ease of use of information systems for the health care industry could significantly facilitate the adoption of technology that has great potential to improve the quality of health care while reducing costs. The goal is to allow medical professionals to interact with health care information technology quickly and easily so that it supports their primary tasks rather than complicating them. Along the way, steps must be taken to ensure that health IT systems are accessible to people with disabilities.



Toward these ends, the National Institute of Standards and Technology (NIST) will host a one-day workshop on improving the usability—ease of use—of health IT. Co-sponsored by the Department of Health and Human Services Office of the National Coordinator and the Agency for Healthcare Research and Quality, the workshop will take place on July 13, 2010, at the NIST Gaithersburg campus.



The goal of the workshop is to promote collaboration in health IT usability among federal agencies, industry, academia and others. Attendees will discuss ways to prioritize, align and coordinate short-, medium-, and long-term strategies and tactics to improve the usability of electronic health records (EHRs). Specific objectives of the workshop will be to establish an immediate-term set of actions to inform the national initiative to drive the adoption and meaningful use of EHRs; develop a strategic approach to measure and assess the use of EHRs and the impact of usability on their adoption and innovation; develop strategies to drive best practices and innovation in health care IT; and inspire follow-on activities in health care IT usability.

Friday, May 28, 2010

PCI tokenization guidance could benefit payment processors

In a recent interview, Bob Russo, general manager of the PCI SSC, said he didn't expect any major changes to the data security standard (PCI DSS), which is undergoing a revision this year. But guidance documents are being developed to help merchants decide whether investing in encryption or PCI tokenization technologies is a wise move.



"We're creating a framework right now where we map these technologies out and lay them next to the standards, so if somebody is using one of these technologies, [the framework] will let them know if they would satisfy certain requirements," Russo said.



Some payment processors and encryption vendors have rallied around a mixture of encryption and tokenization software to protect card data at the moment a customer swipes their credit card at a payment terminal. RSA, the security division of EMC Corp., is working with First Data to provide tokenization technology in the encryption services that First Data sells to merchants. Voltage Security Inc. sells encryption software combined with tokenization and is working closely with Heartland Payment Systems to provide encryption and tokenization services.