When you type a symbol, it is not stored as that symbol (even the digits zero and one are stored as codes rather than as themselves). It is instead stored as a combination of 7 binary digits, also known as bits. There are 128 possible combinations in total - enough to store the English alphabet, numerals, punctuation, and a few other characters, but not every alphabet and specialized symbol. The particular code that decides which sequence stands for which symbol is called ASCII, short for "American Standard Code for Information Interchange."
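As a quick illustration (a minimal sketch in Python, which happens to expose these codes directly), you can look up a character's ASCII number and print its 7-bit pattern:

```python
# Look up the ASCII code for a character and show its 7-bit pattern.
for ch in "A", "a", "7", "!":
    code = ord(ch)              # the character's numeric ASCII code
    bits = format(code, "07b")  # the same code written as 7 binary digits
    print(f"{ch!r} -> {code:3d} -> {bits}")

# 'A' ->  65 -> 1000001
# 'a' ->  97 -> 1100001
# '7' ->  55 -> 0110111
# '!' ->  33 -> 0100001
```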
To cover everything else, programmers designed UTF-8, an encoding that is backward compatible with ASCII and covers the full Unicode character set. UTF-8 uses from 1 to 4 bytes (a byte is a sequence of 8 bits) to store an individual character - enough to give every one of Unicode's roughly 1.1 million possible characters its own unique sequence of binary digits.
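You can see the variable length in action with another small Python sketch (Python's built-in encode method handles the UTF-8 conversion; the characters chosen here are just examples):

```python
# Encode a few characters as UTF-8 and show how many bytes each one needs.
for ch in "A", "é", "€", "😀":
    data = ch.encode("utf-8")
    print(f"{ch!r} -> {len(data)} byte(s): {data.hex(' ')}")

# 'A' -> 1 byte(s): 41
# 'é' -> 2 byte(s): c3 a9
# '€' -> 3 byte(s): e2 82 ac
# '😀' -> 4 byte(s): f0 9f 98 80
```

Plain English characters still take a single byte, exactly as they did in ASCII, while rarer symbols spread across more bytes.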
This whole process rests on an underlying theme in computer science and programming: abstraction. Abstraction, in this context, means taking something that has no obvious relation to your subject and using it to represent that subject, allowing you to communicate, via a code (this is probably where the name originated), with a computer's binary vocabulary. (Puns. Too many puns.) Abstraction is necessary because computers work in a way traditionally described as "strong but dumb": they cannot do anything unless it is explicitly and completely described to them, but they do it extremely quickly.
Now, getting back to symbols. There are several ways to place a symbol in your text, but they depend heavily on your system, application, language (meaning the input or locale setting, not a spoken language), and code page. For example, in Microsoft Word you can type the hexadecimal code for the symbol you want and then press Alt-X to convert it into the character. With a little research, you can easily find a method that works in your particular circumstances, as well as the code for whatever character you want to place.
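The same idea works in code: give the hexadecimal code and let the environment produce the character. A short Python sketch (the heart symbol U+2764 here is just an arbitrary example):

```python
# Turn a hexadecimal code point into the actual character, and back again.
code = 0x2764            # U+2764, "heavy black heart", chosen as an example
print(chr(code))         # prints: ❤
print("\u2764")          # the same character via an escape sequence
print(hex(ord("❤")))     # prints: 0x2764, going the other way
```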