Department of Statistics, Virginia Tech"
output: html_document
---

```{r, echo=FALSE}
options(width=80)
library(knitr)
knit_hooks$set(no.main = function(before, options, envir) {
  if (before) par(mar = c(4.1, 4.1, 1.1, 1.1)) # smaller margin on top
})
```

### Problem 3: Optimization (34pts)

*Write an R function returning*
$$
f(x) = 1 - x + 3x^2 - x^3, \quad x \in [-0.5, 2.5].
$$

The code below defines the function in R.

```{r}
f <- function(x) {
  1 - x + 3 * x^2 - x^3
}
```

*a. Plot the function over that range and note the critical points. Check them against the truth (via calculus).*

From calculus, the derivative of the function is
$$
\frac{d}{dx} f(x) = -1 + 6x - 3x^2,
$$
and we can find the zeros of that function to determine the critical points. The quadratic formula gives

```{r}
sols <- (-6 + c(-1,1)*sqrt(6^2 - 4*(-1)*(-3))) / (2*(-3))
sols
```

The following code plots the function over the input space, augmented with vertical lines at the critical points.

```{r, dev.args=list(bg='transparent'), fig.width=5, fig.height=4, fig.align="center", no.main=TRUE}
x <- seq(-0.5, 2.5, length=1000)
plot(x, f(x), type="l")
abline(v=sols, col=2, lty=2)
legend("top", c("f(x)", "crits"), col=1:2, lty=1:2, bty="n")
```

*b. Use an R library function to find those critical points numerically, and check them against the plot/truth.*

The code below finds the minimum, which is in the left half of the space, by minimizing the function over an appropriate range.

```{r}
optimize(f, lower=-0.5, upper=1)
```

- That agrees with our calculus.

Next, the code below optimizes (this time maximizing) the function over the right half of the input domain.

```{r}
optimize(f, lower=1, upper=2.5, maximum=TRUE)
```

*c. How many iterations (as counted by the number of evaluations of your R function) did it take to find each critical point?*

Counting the number of iterations requires modifying the function a little to implement a (global) counter.
```{r}
f2 <- function(x) {
  cnt <<- cnt + 1
  1 - x + 3 * x^2 - x^3
}
```

Now we can initialize the counter, re-run the optimizations, and report the final count. First the minimum.

```{r}
cnt <- 0
null <- optimize(f2, lower=-0.5, upper=1)
cnt
```

Then the maximum.

```{r}
cnt <- 0
optimize(f2, lower=1, upper=2.5, maximum=TRUE)
cnt
```

Apparently, both take ten iterations. Another way to do this is by finding the roots of the derivative. Here is the derivative as an R function, modified to include a counter.

```{r}
df2 <- function(x) {
  cnt <<- cnt + 1
  6*x - 1 - 3*x^2
}
```

First the minimum.

```{r}
cnt <- 0
uniroot(df2, interval=c(-0.5, 1))
```

Looks good. Now the maximum, after saving the counter.

```{r}
cnt.min <- cnt
cnt <- 0
uniroot(df2, interval=c(1, 2.5))
```

Also good. How does the number of iterations compare?

```{r}
c(cnt.min, cnt)
```

- A smidge better!
- Notice how the reported number of iterations differs from the number of actual evaluations.
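The last point can be made concrete. `uniroot` reports its own iteration count in the `$iter` component of its return value, but the function is also evaluated at the interval endpoints before iterating, so the evaluation counter ends up larger than `$iter`. A small sketch, re-declaring the counted derivative so it stands alone:

```{r}
# Counted derivative of f(x) = 1 - x + 3x^2 - x^3.
cnt <- 0
df2 <- function(x) {
  cnt <<- cnt + 1  # global evaluation counter
  6*x - 1 - 3*x^2  # f'(x) = -1 + 6x - 3x^2
}
out <- uniroot(df2, interval=c(1, 2.5))
c(reported=out$iter, evaluated=cnt)
```

The gap between the two numbers is the extra bookkeeping evaluations, which is why counting evaluations directly, as above, gives the honest answer to part c.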