
Numerical Python Basics

### The End Game

With these abilities in hand, what can you do? In another flashback to math class, the topic of simultaneous linear equations (also known as solving multiple equations with multiple unknowns) comes to mind. In the following example, the goal is to find the values of x and y that make both equations true (at the same time, of course):

    3x + 4y = 11
    5x + 2y =  9

This is the classic problem of finding the place where two lines intersect in the plane. The lines may represent different phenomena, perhaps profit vs. cost in economics or growth vs. decay in biology. In any case, solving for the intersection can be tedious. Add a few more variables, and it can be very difficult.

You can generate the answer by hand or with a computer. The trick to answering the problem with matrices is to write the two equations as a single equality between 2-by-1 matrices. Note how matrix shape, multiplication, and inverses combine to produce the answer.

The system can be rewritten as the product of a 2-by-2 matrix A and a 2-by-1 matrix of unknowns z being equal to a 2-by-1 matrix b:

    A z = b,  where  A = | 3  4 |,  z = | x |,  b = | 11 |
                         | 5  2 |       | y |       |  9 |

Matrix mathematics has a few more rules than traditional algebra does. In order to solve for the unknowns, z needs to be isolated on the left. Thinking algebraically, your first instinct might be to divide through by A. But matrix division is not defined! Fortunately, we can use the inverse. Multiply both sides by the inverse (just like you might multiply both sides by 1/A); since matrix multiplication is not commutative, the inverse must go on the left of each side:

    A⁻¹ A z = A⁻¹ b

On the left, A⁻¹ A reduces to the identity matrix, and the identity matrix multiplied by z is just z: the inverse cancels out matrix A. On the right we are left with the product A⁻¹ b. Since all values on the right are known and only the unknowns remain on the left, a solution is at hand:

    z = A⁻¹ b

These equations are solved with just an inverse and a multiply.
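The cancellation A⁻¹ A = I is easy to verify numerically. The sketch below uses the modern numpy package (the successor to the Numeric module used in this article), so the function names differ slightly from the snippets that follow, but the idea is identical:

```python
import numpy as np

# Coefficient matrix from the example system
A = np.array([[3.0, 4.0],
              [5.0, 2.0]])

Ainv = np.linalg.inv(A)

# A^-1 times A should be (numerically very close to) the 2x2 identity matrix
product = Ainv @ A
print(product.round(10))
```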

Want to calculate that inverse and multiply by hand? I don't. Here is how you do it with NumPy:

>>> A=array(((3,4),(5,2)))
>>> b=reshape(array((11,9)),(2,1))

Notice how we generate b in one fell swoop using a combination of the array() and reshape() functions.

>>> Ainv = inverse(A)
>>> z=matrixmultiply(Ainv,b)
>>> z
array([[ 1.],
       [ 2.]])

which is the answer. Checking the result with the original matrix A,

>>> matrixmultiply(A,z)
array([[ 11.],
       [  9.]])
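For comparison, here is the same computation in the modern numpy package (the descendant of the Numeric module used above). In current practice, numpy.linalg.solve is preferred over forming an explicit inverse, since it is both faster and more numerically stable; the names here are numpy's, not Numeric's:

```python
import numpy as np

A = np.array([[3.0, 4.0],
              [5.0, 2.0]])
b = np.array([[11.0],
              [9.0]])

# Solve A z = b directly, without explicitly forming A^-1
z = np.linalg.solve(A, b)
print(z)        # the same [[1.], [2.]] answer

# Check: multiplying A by z should reproduce b
print(A @ z)
```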

which reproduces b, confirming the answer. Hopefully the exercise was less painful than solving things by hand. The power of Python/NumPy is only hinted at by this simple problem; the true power is in solving larger problems. If the number of unknowns and equations equaled 10, it would be difficult to solve the problem by hand.
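To see how the same approach scales, the sketch below (again using modern numpy, with a made-up random system rather than one from the article) solves 10 equations in 10 unknowns; only the shapes change, not the code:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 10
A = rng.standard_normal((n, n))   # 10 equations, 10 unknowns
x_true = rng.standard_normal(n)   # an answer chosen in advance
b = A @ x_true                    # right-hand side built from that answer

# The same one-line solve as in the 2x2 case
x = np.linalg.solve(A, b)
print(np.allclose(x, x_true))
```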

I haven't even touched on DISLIN. I am saving that for the next article. Visualization can be very useful. Remember all that graphing of equations? DISLIN can do it for you. In the following plot, the green and red lines are those of the equations above. They intersect at the point x=1 and y=2, which is exactly the answer we got.

### More to Come

In upcoming articles we will explore more useful matrix manipulations with NumPy. We will also graph the results with DISLIN to help us see what it is we are doing. You may find the 2D graph above uninspiring, but check out this graph of 3D information. This graphs the response of a control system (perhaps the anti-lock braking system in a car) at points of instability. The peaks actually rise to infinity and were clipped for plotting purposes.

Want to launch ahead with Numerical Python? Are you struggling to remember that math? Hungry for more? While you are waiting for next month's article, here is where you can find out more. Read the Numerical Python Manual. It was written as a tutorial as well as a reference.

Want to buy some books on linear algebra and matrices? The book Applied Linear Algebra by Ben Noble and James W. Daniel is an excellent read. If your interests lie more in the computational aspect of matrices, the ever popular Matrix Computations by Gene H. Golub and Charles F. Van Loan should also be on your reading list.

Eric Hagemann specializes in fast algorithms for crunching numbers on all varieties of computers from embedded to mainframe.