
User:SigmaJargon/Math5610Assignment4


1.) Spline vs. Cubic Hermite

a.) Data:




The coefficients of the piecewise cubic function on the first interval (0-1) are found by solving:

And on the second interval (1-2):

b.) This becomes a cubic spline if the second derivative is continuous. Since we only have two intervals, this will hold if the second derivatives of the two pieces agree at x = 1. That is, if:

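The data and the resulting condition live in the elided math above, but the check itself is easy to do numerically. Here's a rough Python sketch; the nodes 0, 1, 2 come from the problem, while the values and slopes are made-up placeholders. It solves the two 4-by-4 systems for the cubic coefficients and then compares the second derivatives of the two pieces at x = 1.

    import numpy as np

    # Made-up placeholder data: values y and slopes d at the nodes x = 0, 1, 2.
    x = np.array([0.0, 1.0, 2.0])
    y = np.array([1.0, 2.0, 0.0])
    d = np.array([0.0, 1.0, -1.0])

    def hermite_cubic(x0, x1, y0, y1, d0, d1):
        # Coefficients (a, b, c, e) of a + b*t + c*t**2 + e*t**3 that match the
        # values and first derivatives at x0 and x1.
        A = np.array([[1.0,  x0, x0**2,   x0**3],
                      [1.0,  x1, x1**2,   x1**3],
                      [0.0, 1.0,  2*x0, 3*x0**2],
                      [0.0, 1.0,  2*x1, 3*x1**2]])
        return np.linalg.solve(A, np.array([y0, y1, d0, d1]))

    p0 = hermite_cubic(x[0], x[1], y[0], y[1], d[0], d[1])   # piece on [0, 1]
    p1 = hermite_cubic(x[1], x[2], y[1], y[2], d[1], d[2])   # piece on [1, 2]

    # Second derivative of a + b*t + c*t**2 + e*t**3 is 2*c + 6*e*t.
    left = 2*p0[2] + 6*p0[3]*x[1]
    right = 2*p1[2] + 6*p1[3]*x[1]
    print("p''(1-) =", left, " p''(1+) =", right, " spline?", np.isclose(left, right))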
2.) More on Cubic Splines

Data:

a.) I'm not sure I should even bother solving the system of equations to find the coefficients - it's pretty obvious that is what we're looking for.
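For what it's worth, here is a hedged sketch of computing cubic-spline coefficients numerically; the data points and the 'natural' end conditions below are placeholders, not necessarily what the assignment specifies.

    import numpy as np
    from scipy.interpolate import CubicSpline

    # Placeholder data; the assignment's actual points are not shown here.
    x = np.array([0.0, 1.0, 2.0, 3.0])
    y = np.array([0.0, 1.0, 0.0, 1.0])

    # 'natural' end conditions (zero second derivative at both ends) are an
    # assumption; other end conditions change the coefficients.
    s = CubicSpline(x, y, bc_type='natural')

    # s.c[k, j] is the coefficient of (x - x[j])**(3 - k) on the j-th interval.
    print(s.c)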

b.)

which yields the polynomial:

c.)

3.) Linear Programming

At first glance, I'd've said no. First of all, Meg has a less-than-shining record of trying to squeeze money out of me for bogus solutions, and second, the claim that the speed is independent of the size of the problem is indeed stunning.

However, it never hurts to give her the benefit of the doubt. Let's work through it a bit and see how we feel after some analysis.

First off, on the transformation of an inequality constraint to an equality constraint: if we require that Ax ≤ b, then it follows that Ax + s = b for some nonnegative "slack" vector s. A sign constraint is really just an inequality constraint, and can be treated in a similar way.
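To make that transformation concrete, here is a small sketch on a made-up toy LP (the numbers are purely illustrative): the inequality form min c'x subject to Ax ≤ b, x ≥ 0 is rewritten as Ax + s = b with s ≥ 0, and both versions are handed to scipy's solver to confirm they report the same optimum.

    import numpy as np
    from scipy.optimize import linprog

    # Toy problem (made up): minimize c'x subject to A x <= b, x >= 0.
    c = np.array([-1.0, -2.0])
    A = np.array([[1.0, 1.0],
                  [1.0, 3.0]])
    b = np.array([4.0, 6.0])

    # Inequality form.
    res_ineq = linprog(c, A_ub=A, b_ub=b, bounds=(0, None), method="highs")

    # Equality form: one slack variable per row, so that A x + s = b with s >= 0.
    m = A.shape[0]
    A_eq = np.hstack([A, np.eye(m)])
    c_eq = np.concatenate([c, np.zeros(m)])   # the slack variables cost nothing
    res_eq = linprog(c_eq, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")

    print(res_ineq.fun, res_eq.fun)   # both should report the same optimal value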


4.) Polynomial Interpolation

5.) Runge Interpolation and 6.) Judicious Interpolation

Uploading the 40 images for these two questions would be a headache and a chore. Therefore I provide 4 images - the interpolations at n=10 and n=20. If you'd like to generate more, then you can use these Maple worksheets I created:


Runge Interpolation

Judicious Interpolation


10th degree Runge

File:Runge10.jpg


20th degree Runge

File:Runge20.jpg


10th degree Judicious

File:Judicious10.jpg


20th degree Judicious

File:Judicious20.jpg
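For anyone without Maple handy, here is a rough Python stand-in for those worksheets. Two assumptions are baked in: the target is the usual Runge example f(x) = 1/(1 + 25x^2) on [-1, 1], and "judicious" means Chebyshev-spaced nodes; swap in the assignment's actual function and nodes if they differ.

    import numpy as np
    import matplotlib.pyplot as plt
    from scipy.interpolate import BarycentricInterpolator

    def f(x):
        return 1.0 / (1.0 + 25.0 * x**2)   # assumed Runge function

    xs = np.linspace(-1, 1, 500)

    for n in (10, 20):
        equi = np.linspace(-1, 1, n + 1)                                # equispaced nodes
        cheb = np.cos((2*np.arange(n + 1) + 1) * np.pi / (2*(n + 1)))   # Chebyshev nodes
        for nodes, label in ((equi, "Runge"), (cheb, "Judicious")):
            p = BarycentricInterpolator(nodes, f(nodes))   # degree-n interpolant
            plt.figure()
            plt.plot(xs, f(xs), label="f")
            plt.plot(xs, p(xs), label="interpolant")
            plt.plot(nodes, f(nodes), "o")
            plt.title(f"{label}, n = {n}")
            plt.legend()
    plt.show()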

7.) Interpolation of Symmetric Data is Symmetric

We're interpolating over 2n+1 points, so the interpolating polynomial has degree at most 2n. Since the data is symmetric - the nodes satisfy x_{-i} = -x_i and the values satisfy y_{-i} = -y_i - the claim is that the interpolating polynomial p is an odd function.

For the sake of contradiction, let's say that the interpolant p isn't odd. Then there is some point x_0 along the curve so that p(-x_0) ≠ -p(x_0). x_0 cannot be any of the interpolating points, since the data satisfies y_{-i} = -y_i. We can define q by rotating p 180 degrees - that is, q(x) = -p(-x). All of the interpolating points lie along this polynomial, since q(x_i) = -p(-x_i) = -y_{-i} = y_i for all i. However, since p(-x_0) ≠ -p(x_0), we have q(x_0) ≠ p(x_0), so p and q are different polynomials. However, the interpolating polynomial is unique! Thus we have a contradiction, and it is proved that p is odd.
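As a quick numerical sanity check of the claim (not part of the proof), interpolate some made-up odd-symmetric data and confirm that p(-x) = -p(x):

    import numpy as np

    # Made-up odd-symmetric data: x_{-i} = -x_i and y_{-i} = -y_i.
    n = 3
    x = np.linspace(-1, 1, 2*n + 1)      # symmetric nodes, including 0
    y = np.sin(2*x) + x**3               # an odd function sampled at the nodes

    p = np.polyfit(x, y, 2*n)            # interpolant of degree at most 2n
    t = np.linspace(-1, 1, 7)
    print(np.allclose(np.polyval(p, -t), -np.polyval(p, t)))   # True: p is odd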

8.) Linear Independence of Bernstein-Bezier Basis Functions

The first and most important thing to note about the Bernstein-Bezier basis functions B_i^n(t) = C(n,i) t^i (1-t)^(n-i) is that if you expand the (1-t)^(n-i) factor, you always end up with a 1, plus a bunch of higher order terms. If you then multiply with t^i, the lowest-order term that survives is t^i; in other words, the ith basis function contains no terms of order lower than i.

Then consider a vanishing linear combination c_0 B_0^n + c_1 B_1^n + ... + c_n B_n^n = 0. The 0th basis function contains a non-zero term of order 0 (that is, constant). However, it is the only basis function to contain such a term, and so since the combination is identically zero we conclude c_0 = 0.

We then assume that c_0 = c_1 = ... = c_{k-1} = 0, and then seek a proof that c_k = 0. Consider the kth basis function. It is the last basis function to contain a term of order k. In the combination, all basis functions with index less than k contribute nothing, since their coefficients are 0. So the kth basis function is the only non-zero function contributing a term of order k to the sum. However, since the sum equals 0, we conclude that c_k = 0.

Thus, by induction, c_i = 0 for all i.
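The triangular structure driving this induction is easy to see numerically: writing each B_i^n in the monomial basis gives a coefficient matrix with zeros below the diagonal and the non-zero values C(n, i) on it, hence full rank. A small sketch (the degree n = 4 is arbitrary):

    import numpy as np
    from math import comb

    n = 4   # any fixed degree works

    # M[i, j] = coefficient of t**j in B_i^n(t) = C(n, i) * t**i * (1 - t)**(n - i).
    M = np.zeros((n + 1, n + 1))
    for i in range(n + 1):
        for k in range(n - i + 1):
            M[i, i + k] = comb(n, i) * comb(n - i, k) * (-1)**k

    print(np.allclose(np.tril(M, -1), 0))      # True: B_i^n has no terms below order i
    print(M.diagonal())                        # the order-i coefficients C(n, i), all non-zero
    print(np.linalg.matrix_rank(M) == n + 1)   # True: the basis functions are independent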

9.) Uniqueness of Interpolating Polynomial

a.) Power Form

This yields , , , and . So .


b.) Lagrange Form

So . Substituting and simplifying everything down leaves us with:


c.) Newton Form







So, . If we expand all the terms and then simplify, we arrive at:

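Since the whole point of the problem is that all three forms simplify to the same polynomial, a numerical cross-check is easy. The four data points below are placeholders rather than the assignment's values; the sketch builds the interpolant in power (Vandermonde), Lagrange, and Newton form and compares them.

    import numpy as np

    # Placeholder data: four points, as in the problem, but made-up values.
    x = np.array([0.0, 1.0, 2.0, 3.0])
    y = np.array([1.0, 3.0, 2.0, 5.0])
    t = np.linspace(-0.5, 3.5, 9)

    # (a) Power form: solve the Vandermonde system for the coefficients.
    a = np.linalg.solve(np.vander(x, increasing=True), y)
    power = sum(a[k] * t**k for k in range(4))

    # (b) Lagrange form: sum of y_i times the cardinal polynomials.
    lagrange = sum(
        y[i] * np.prod([(t - x[j]) / (x[i] - x[j]) for j in range(4) if j != i], axis=0)
        for i in range(4)
    )

    # (c) Newton form: divided differences, then nested multiplication.
    coef = y.astype(float)
    for j in range(1, 4):
        coef[j:] = (coef[j:] - coef[j-1:-1]) / (x[j:] - x[:-j])
    newton = coef[3]
    for j in (2, 1, 0):
        newton = newton * (t - x[j]) + coef[j]

    print(np.allclose(power, lagrange), np.allclose(power, newton))   # True True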
10.) The Method of Undetermined Coefficients

We expect that since we have four points, we'll at least have accuracy up to cubic functions. We then want to check whether we can find values for the weights such that we have exactness for quartic polynomials.

I claim that there are no weights that fulfill the conditions. To show this, consider this counter-example:

Let . In order to find the weights, let's throw four specific functions at the equality, and see what turns up.


. Plugging in the four functions above yields:


Or, in matrix form:


However, if we consider a slightly different set of functions

and try to find the weights, we get:

Which is different! So we can't find a single set of weights that works for all quartic functions. And if we can't find working weights for quartic functions, we aren't going to find them for higher-order polynomials.

So! We now want to find the weights for cubic functions!

To do so, we use the procedure described in the 10/25 lecture.
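The actual rule (what it approximates and where the four points sit) is in the elided formulas above, so the following is only a generic illustration of the procedure, assuming for the sake of example a four-point rule for the integral of f over [0, 1] with equispaced nodes: solve for weights exact on 1, x, x^2, x^3, and note that the same weights are not exact on x^4.

    import numpy as np

    # Assumed setup, for illustration only: approximate the integral of f over
    # [0, 1] by w0*f(x0) + ... + w3*f(x3) at four equispaced nodes.
    x = np.array([0.0, 1/3, 2/3, 1.0])

    # Exactness on 1, x, x**2, x**3 gives four linear equations for the weights:
    # sum_i w_i * x_i**k = 1/(k + 1) for k = 0, 1, 2, 3.
    V = np.vander(x, 4, increasing=True).T
    rhs = np.array([1/(k + 1) for k in range(4)])
    w = np.linalg.solve(V, rhs)
    print("weights:", w)

    # The same weights are not exact on a quartic.
    print("rule applied to x**4:", w @ x**4, " exact integral:", 1/5)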


11.) Error Analysis

12.) The Method of Undetermined Coefficients, Continued

Wait wait wait. h has no relation to the point we're considering. So it is arbitrary, another number we can also pick so as to maximize our polynomial degree. Since this is so, it quickly falls out that this formula is accurate for polynomials of degree as high as we wish: simply set . Then we have . Of course, you should see where this is going - simply take a small h, or in general the limit as h goes to 0, and we have that . This we know to be true, as it is the definition of the derivative.
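The formula itself is in the elided math above, but the limiting behaviour being described is easy to watch numerically; the sketch below just uses the plain forward difference (f(x+h) - f(x))/h as a stand-in.

    import numpy as np

    def f(x):
        return np.sin(x)

    x0 = 1.0
    exact = np.cos(x0)   # the true derivative of sin at x0

    for h in (1e-1, 1e-2, 1e-3, 1e-4):
        fd = (f(x0 + h) - f(x0)) / h      # forward difference
        print(h, fd, abs(fd - exact))     # the error shrinks as h -> 0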