This dissertation is concerned with the derivation of necessary optimality conditions for multicriteria optimization problems. The main tool is the concept of so-called derived sets, introduced around 1965 by Hestenes for scalar optimization problems. Derived sets constitute a rather general notion of a derivative. In 1994, Breckner transferred this notion to the multiobjective setting. Using derived sets, necessary optimality conditions can be stated in the form of Lagrange multiplier rules.

The dissertation first presents an introduction to this theory, giving the reader a survey of the most important structural properties of the notions introduced. It then provides results on the existence of derived sets as well as on ways to construct these sets in certain cases. In this context, relationships between well-known concepts of differentiation and derived sets become evident.

By means of Ekeland's variational principle, an optimality condition for approximate solutions of the multicriteria optimization problem under consideration is derived, analogous to Breckner's multiplier rule. To this end, a vector-valued version of the variational principle is applied. The derivation relies on some additional assumptions on the derived sets involved; however, it is shown that these assumptions are not restrictive but are satisfied in many applications.

The final chapter is devoted to regularity conditions. Starting from the regularity condition appearing in Hestenes' paper, an analogue for Breckner's multiplier rule is derived. The resulting condition is shown to be both necessary and sufficient. In special cases, this condition is compared with the constraint qualifications commonly used as regularity conditions.
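To indicate the form such multiplier rules take, consider the standard smooth, finite-dimensional special case (a textbook illustration only, not the derived-set formulation developed in the dissertation): a Fritz John type necessary condition for a locally weakly efficient point of a multiobjective problem with inequality constraints.

```latex
% Illustration (smooth special case): minimize f = (f_1, \dots, f_m)
% subject to g_j(x) \le 0, j = 1, \dots, p, over x \in \mathbb{R}^n.
% If x^* is locally weakly efficient and all functions are differentiable,
% then there exist multipliers, not all zero, such that
\exists\, \lambda \in \mathbb{R}^m_+,\ \mu \in \mathbb{R}^p_+,\ (\lambda,\mu) \neq 0:
\quad
\sum_{i=1}^{m} \lambda_i \nabla f_i(x^*) + \sum_{j=1}^{p} \mu_j \nabla g_j(x^*) = 0,
\quad
\mu_j\, g_j(x^*) = 0 \quad (j = 1,\dots,p).
```

The derived-set approach replaces the gradients above by more general first-order approximations, so that multiplier rules of this shape remain available without differentiability assumptions.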
From an epistemological point of view, it is argued that constraint qualifications are, for structural reasons, too strong; as a consequence, these conditions do not permit a sharp distinction between regular and irregular problem data.
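As an example of the constraint qualifications referred to here, the Mangasarian–Fromovitz constraint qualification (MFCQ) for the smooth inequality-constrained case can serve as an illustration (it is a standard representative, not necessarily one of the specific conditions treated in the dissertation):

```latex
% MFCQ at a feasible point x^*, with active index set
% A(x^*) = \{\, j : g_j(x^*) = 0 \,\}:
\exists\, d \in \mathbb{R}^n:\quad
\nabla g_j(x^*)^{\mathsf{T}} d < 0
\quad \text{for all } j \in A(x^*).
% Under MFCQ, the multipliers \lambda attached to the objectives in the
% Fritz John conditions can be chosen nonzero (KKT-type conditions).
```

Such conditions exclude degenerate constraint geometries outright, which is precisely the structural strength criticized above: they can fail for problem data that is, in the sense of the dissertation's regularity condition, still regular.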