This paper is about finding a sparse solution to a linear system. Given a matrix A and a vector b, we seek an x satisfying Ax = b, posed as a minimization problem: minimize a norm of x subject to Ax = b. With L0-regularization, where the L0-"norm" counts the non-zero components of x, the problem is NP-hard. We therefore apply a convex relaxation, replacing the L0-norm with the L1-norm. With this relaxation we can find a sparse solution, that is, a solution with few non-zero entries. Sparse vectors are advantageous because computations with them take less time, and they require less storage, since only the positions and values of the non-zero entries need to be stored. The problem arises in Compressed Sensing and in many other applications; Compressed Sensing is concerned with representing signals using few non-zero coefficients in a suitable basis. Guarantees for sparse recovery rely on conditions on the system, such as the Null Space Property, the spark, the Restricted Isometry Property and the coherence. There are many methods for solving this problem; in this paper, Basis Pursuit and greedy algorithms are presented.
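As an illustration of the L1 relaxation, the Basis Pursuit problem min ||x||_1 subject to Ax = b can be recast as a linear program by splitting off auxiliary variables u with |x_i| <= u_i and minimizing sum(u). The sketch below, with a hypothetical helper `basis_pursuit` and a small random instance (not from the paper), solves this LP with SciPy:

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """Solve min ||x||_1 s.t. Ax = b via an LP over z = [x, u].

    We minimize sum(u) subject to |x_i| <= u_i (two inequalities
    per coordinate) and Ax = b; the optimal x is L1-minimal.
    """
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(n)])     # objective: sum(u)
    I = np.eye(n)
    # x - u <= 0  and  -x - u <= 0  encode |x| <= u
    A_ub = np.block([[I, -I], [-I, -I]])
    b_ub = np.zeros(2 * n)
    # equality constraint Ax = b (u does not appear)
    A_eq = np.hstack([A, np.zeros((m, n))])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b,
                  bounds=[(None, None)] * n + [(0, None)] * n)
    return res.x[:n]

# Small illustrative instance: a 1-sparse vector measured by a
# random underdetermined system (3 equations, 6 unknowns).
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 6))
x_true = np.zeros(6)
x_true[2] = 1.5
b = A @ x_true
x_hat = basis_pursuit(A, b)
```

By construction the returned x_hat is feasible (A x_hat = b up to solver tolerance) and its L1-norm is no larger than that of any other feasible vector, including x_true.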
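The greedy approach can be sketched with Orthogonal Matching Pursuit, a representative greedy algorithm: at each step it selects the column of A most correlated with the current residual, then refits by least squares on the selected support. The function name `omp` and the parameters below are illustrative, not taken from the paper:

```python
import numpy as np

def omp(A, b, k, tol=1e-10):
    """Orthogonal Matching Pursuit: build a k-sparse approximation.

    Greedily picks the column of A most correlated with the residual,
    then solves a least-squares problem restricted to the chosen columns.
    """
    m, n = A.shape
    residual = b.copy()
    support = []
    x = np.zeros(n)
    for _ in range(k):
        # column index with the largest correlation to the residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # least-squares refit on the current support
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        residual = b - A[:, support] @ coef
        if np.linalg.norm(residual) < tol:
            break
    x[support] = coef
    return x

# Illustrative use: recover a 2-sparse vector from a random system.
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 20))
x_true = np.zeros(20)
x_true[[3, 11]] = [1.0, -2.0]
x_hat = omp(A, A @ x_true, k=2)
```

By design the output has at most k non-zero entries; whether it matches x_true exactly depends on recovery conditions on A such as coherence.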