Advanced calculus is also advanced pre-calculus: certain points from pre-calculus arise all the time in this course, and in higher mathematics in general. Here are some of them. (This list is not exhaustive.)
You should be able to sketch curves such as ellipses and hyperbolas:
$$
\frac {(x - x_0)^2}{a^2} + \frac{(y - y_0)^2}{b^2} = 1
$$
$$
\frac {(x - x_0)^2}{a^2} - \frac{(y-y_0)^2}{b^2} = 1
$$
You should know by heart the most basic trigonometric identities: not only
$$
\sin^2 \theta+ \cos^2\theta = 1,
$$
but also
$$
\cos^2 \theta = \frac 12( 1+\cos 2\theta),
\qquad
\sin^2 \theta = \frac 12( 1-\cos 2\theta).
$$
These are easy to remember if you have a good mental picture of
what $\cos^2$ or $\sin^2$ looks like.
You should be able to derive more or less immediately other
identities that follow directly from the above, such as
$
\sec^2\theta - \tan^2\theta = 1
$.
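For example, the secant-tangent identity follows by dividing both sides of $\sin^2\theta + \cos^2\theta = 1$ by $\cos^2\theta$:
$$
\frac{\sin^2\theta}{\cos^2\theta} + 1 = \frac{1}{\cos^2\theta},
\qquad \mbox{i.e.} \qquad
\tan^2\theta + 1 = \sec^2\theta.
$$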
You should also know the basic properties of exponentials and logarithms.
2. Linear Algebra.
The following topics should be familiar to students from earlier courses
such as MAT223.
dot product:
the definition of the dot product $\bf v \cdot \bf w$, for vectors $\bf v, \bf w$.
$|{\bf v}| = \sqrt { \bf v \cdot\bf v} = \mbox{the Euclidean norm of }\bf v$.
$\bf v \cdot \bf w = |\bf v| \ |\bf w| \cos \theta$, where $\theta$ is the angle between
the two vectors.
in particular, $\bf v$ and $\bf w$ are orthogonal if and only if ${\bf v\cdot \bf w} = 0$.
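If you like, these formulas are easy to check numerically. Here is a small Python sketch (the vectors are arbitrary examples, not part of the course material):

```python
import math

# Sketch: dot product, Euclidean norm, and the angle formula.
def dot(v, w):
    return sum(vi * wi for vi, wi in zip(v, w))

def norm(v):
    return math.sqrt(dot(v, v))

v = [1.0, 0.0]
w = [1.0, 1.0]

cos_theta = dot(v, w) / (norm(v) * norm(w))
theta = math.acos(cos_theta)   # angle between v and w, in radians

# v = (1,0) and w = (1,1) meet at 45 degrees, i.e. pi/4.
print(theta)

# Orthogonality test: (1,0) and (0,1) have dot product 0.
print(dot([1.0, 0.0], [0.0, 1.0]))
```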
linear dependence and independence.
matrix-matrix and matrix-vector multiplication.
Connection between $m\times n$ matrices and linear mappings $\mathbb R^n \to \mathbb R^m$.
How to visualize linear mappings, particularly $\mathbb R^2 \to \mathbb R^2$.
How to solve the equation $A\bf x = b$ by Gaussian elimination (when a solution exists).
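The mechanics of elimination can be sketched in a few lines of Python (a bare-bones illustration with partial pivoting, assuming $A$ is square and the solution is unique; the example system is arbitrary):

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting.

    A is a list of n rows of length n, b a list of length n.
    Assumes a unique solution exists (det A != 0).
    """
    n = len(A)
    # Work on an augmented copy [A | b] so the inputs are not modified.
    M = [row[:] + [bi] for row, bi in zip(A, b)]

    # Forward elimination: reduce to upper-triangular form.
    for col in range(n):
        # Partial pivoting: bring the largest entry in this column up.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]

    # Back-substitution.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# 2x + y = 5, x + 3y = 10 has solution x = 1, y = 3.
print(solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]))
```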
Suppose that $A$ is an $m\times n$ matrix.
If the $n$ columns of $A$ are linearly independent (which can only happen if $n \le m$), then
$$
\{ A {\bf x} : {\bf x}\in \mathbb R^n \} \mbox{ is an $n$-dimensional subspace of }\mathbb R^m.
$$
In fact it is the subspace consisting of all linear combinations of columns of $A$.
If the $m$ rows of $A$ are linearly independent (which can only happen if $m\le n$), then
$$
\{ {\bf x}\in \mathbb R^n : A {\bf x}= {\bf 0} \} \mbox{ is an $(n-m)$-dimensional subspace of }\mathbb R^n.
$$
It is the subspace of vectors that are orthogonal to all the rows of $A$.
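A concrete illustration of the second fact: for the $2\times 3$ matrix
$$
A = \begin{pmatrix} 1 & 0 & 2 \\ 0 & 1 & 3 \end{pmatrix},
$$
the rows are linearly independent, and $A{\bf x} = {\bf 0}$ forces $x_1 = -2x_3$ and $x_2 = -3x_3$. The solution set is $\{ t(-2,-3,1) : t \in \mathbb R \}$, a subspace of $\mathbb R^3$ of dimension $n - m = 3 - 2 = 1$.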
the determinant and related topics.
(Note, when speaking about determinants, we always implicitly assume that we are referring to square matrices.)
the definition of the determinant, and how to compute it. (In fact, once you know any method of computing the determinant, you can consider that to be the definition of the determinant.)
The matrix $A$ is invertible if and only if $\det A\ne 0$.
The equation $A\bf x = b$ has a unique solution for every $\bf b$, if and only if $\det A\ne 0$.
The rows of a matrix $A$ are linearly independent if and only if $\det A \ne 0$. Similarly, the columns are linearly independent if and only if $\det A \ne 0$.
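For instance, cofactor expansion along the first row is one valid (if very slow) method, and hence one valid definition. A sketch in Python, with small example matrices:

```python
def det(A):
    """Determinant of a square matrix, by cofactor expansion along row 0.

    Naive O(n!) recursion -- fine for illustration, not for large matrices.
    """
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

print(det([[1, 2], [3, 4]]))                    # -2
print(det([[2, 0, 0], [0, 3, 0], [0, 0, 4]]))   # 24 (product of the diagonal)
```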
Advanced properties of the determinant. (These may not all be familiar).
the determinant is alternating. This means that if we swap two columns of a matrix, the sign of the determinant changes. Similarly if we swap two rows.
The determinant is also multilinear. This means that if we fix all of the rows except for the $j$th row, then the determinant depends linearly on the $j$th row.
If $f$ is any real-valued function whose domain is the set of all $n\times n$ matrices, and if $f$ is alternating and multilinear, and if $f(I)=1$ (where $I$ denotes the identity matrix), then $f(A)=\det(A)$ for all matrices $A$.
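In the $2\times 2$ case, where $\det \begin{pmatrix} a & b \\ c & d \end{pmatrix} = ad - bc$, both properties are easy to check by hand; here is a small Python sketch with arbitrary example matrices:

```python
# Check "alternating" and row-multilinearity using the 2x2 formula
# det [[a, b], [c, d]] = a*d - b*c.
def det2(A):
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

A = [[1, 2], [3, 4]]
swapped = [A[1], A[0]]                    # swap the two rows
print(det2(swapped) == -det2(A))          # True: swapping rows flips the sign

# Multilinearity in row 0: with row 1 fixed, det is linear in row 0.
r, s = [1, 2], [5, 7]
combined = [[r[0] + 3 * s[0], r[1] + 3 * s[1]], A[1]]
print(det2(combined) == det2([r, A[1]]) + 3 * det2([s, A[1]]))  # True
```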
eigenvalues and eigenvectors: what they are, and how to find them.
If $A$ is an $n\times n$ matrix, then a number $\lambda$ is an
eigenvalue if and only if it satisfies the equation $\det (A-\lambda I)=0$, where $I$ denotes the identity matrix. The left-hand side of this equation is a polynomial of degree $n$ in the variable $\lambda$.
Once you have identified $\lambda$ as an eigenvalue then you can find a (nonzero) eigenvector $\bf v$ by solving $(A- \lambda I) \bf v = \bf 0$, perhaps by Gaussian elimination. (The solution is
never unique.)
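A worked illustration (the matrix is just an example): for $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$, the equation $\det(A - \lambda I) = (2-\lambda)^2 - 1 = 0$ has roots $\lambda = 1$ and $\lambda = 3$, and solving $(A - \lambda I){\bf v} = {\bf 0}$ by hand gives the eigenvectors checked below:

```python
def mat_vec(A, v):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[2.0, 1.0], [1.0, 2.0]]

# Eigenpairs found by hand from det(A - lam*I) = (2 - lam)^2 - 1 = 0.
pairs = [(3.0, [1.0, 1.0]),     # lam = 3, eigenvector (1, 1)
         (1.0, [1.0, -1.0])]    # lam = 1, eigenvector (1, -1)

for lam, v in pairs:
    print(mat_vec(A, v) == [lam * x for x in v])   # True: A v = lam v
```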
3. Calculus.
The following topics should be familiar to students from earlier courses
such as MAT137.
The use of quantifiers such as $\forall, \exists$.
properties of the real numbers:
every nonempty set that is bounded above has a least upper bound (or supremum). Similarly, every nonempty set that is bounded below has a greatest lower bound (or infimum).
You should be able to carry out easy proofs using the notions of supremum and infimum.
every bounded, monotone sequence has a limit.
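A typical easy example of a proof about suprema: let $A = \{1 - \frac 1n : n = 1, 2, 3, \dots\}$. Then $1$ is an upper bound for $A$, and for any $\epsilon > 0$ we can choose $n$ with $\frac 1n < \epsilon$, so that $1 - \frac 1n > 1 - \epsilon$; hence no number smaller than $1$ is an upper bound, and $\sup A = 1$. (Note that $1 \notin A$, so a supremum need not belong to the set.)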
Limits and Continuity
definition of $\lim_{x\to a} f(x)$; definition of a continuous function.
ability to carry out $\epsilon$-$\delta$ proofs using these definitions.
all elementary functions (polynomials, exponential and log, trigonometric functions
and their inverses) are continuous on their domains.
properties of limits: sums and products of limits, squeeze theorem, etc.
the Intermediate Value Theorem.
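To recall the shape of an $\epsilon$-$\delta$ argument mentioned above: to prove that $\lim_{x\to 2} x^2 = 4$, given $\epsilon > 0$, take $\delta = \min(1, \epsilon/5)$. If $0 < |x - 2| < \delta$, then $|x - 2| < 1$ forces $|x + 2| < 5$, so
$$
|x^2 - 4| = |x - 2|\,|x + 2| < 5\,|x - 2| < 5 \cdot \frac{\epsilon}{5} = \epsilon.
$$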
Differentiation
definition of the derivative.
Rolle's Theorem, Mean Value Theorem.
chain rule, product rule.
implicit differentiation.
derivatives of elementary functions.
l'Hôpital's Rule.
applications, for example to curve sketching and to optimization problems.
Taylor's Theorem, with formulas for the remainder.
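For reference, one standard form of the remainder (the Lagrange form): if $f$ is $n+1$ times differentiable on an interval containing $a$ and $x$, then
$$
f(x) = \sum_{k=0}^{n} \frac{f^{(k)}(a)}{k!} (x-a)^k + \frac{f^{(n+1)}(\xi)}{(n+1)!} (x-a)^{n+1}
$$
for some $\xi$ between $a$ and $x$.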
Integration and the Fundamental Theorem of Calculus
definition of the definite integral.
piecewise continuous functions are integrable.
the Fundamental Theorem of Calculus.
integration by parts.
substitution.
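A one-line worked example of integration by parts, with $u = x$ and $dv = e^x\,dx$:
$$
\int x e^x \, dx = x e^x - \int e^x \, dx = (x - 1) e^x + C.
$$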
Sequences, Series, Convergence Tests
4. Other.
Basics of personal hygiene - shower or bathe regularly, etc - were covered in MAT137, as well as earlier in your educational career.