Real feelings and mood changes can be read from the eyes because, of all human communication cues, the eyes provide the most revealing and accurate information. Since the eyes play such an important role in communication, a human-computer interface can be controlled by voluntarily moving them. In this study, suitable feature-extraction and classification methods were investigated for using electrooculography (EOG) signals, obtained from seven different voluntary eye movements, in a human-computer interface. The accuracy of the system was increased by applying sequential forward feature selection to determine the combination of features that gives the best result. The developed method achieved 93.9% accuracy on the seven-class dataset. The results show that a human-computer interface can be controlled with high accuracy using voluntary eye movements. Moreover, the development of a real-time working model is encouraging for future work.
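The selection step described above can be sketched as follows. This is a minimal illustration of sequential forward feature selection with scikit-learn, not the study's actual pipeline: the real EOG features and classifier are not specified here, so synthetic seven-class data and a k-NN classifier stand in as assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for EOG feature vectors: 7 movement classes,
# 20 candidate features per sample (assumed numbers, for illustration).
X, y = make_classification(
    n_samples=700, n_features=20, n_informative=8,
    n_classes=7, random_state=0,
)

clf = KNeighborsClassifier(n_neighbors=5)

# Greedily add one feature at a time, keeping the subset that maximizes
# cross-validated accuracy (i.e., sequential *forward* selection).
sfs = SequentialFeatureSelector(
    clf, n_features_to_select=8, direction="forward", cv=5,
)
sfs.fit(X, y)

selected = np.flatnonzero(sfs.get_support())
score = cross_val_score(clf, sfs.transform(X), y, cv=5).mean()
print("selected feature indices:", selected)
print(f"cross-validated accuracy with selected subset: {score:.3f}")
```

In a real EOG setting, the candidate columns of `X` would be time- and frequency-domain features extracted from the horizontal and vertical EOG channels, and the classifier would be whichever model the study evaluated.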