Third week as a software developer intern
This third week involved a lot of learning. Not gonna lie, reading about physics topics took me by surprise, but it was a great moment of reflection, which made me think about how lucky we are to belong to this little blue dot called Earth. I also had a brief exposure to quantum mechanics; although it was a short introduction, I understood the most basic concepts of the discipline and the impact it will have in the future.
All this revolution of thought in physics comes from the double-slit experiment. When a beam of electrons is passed through two slits, you would expect to measure two sets of electrons on the other side, one corresponding to each slit. However, the result is counterintuitive, at least for the ideas of the time (and still today): what we find is a probability distribution reminiscent of an interference pattern, which is what occurs when two waves come together. It does not end there: if we measure which slit each electron passes through, we no longer obtain an interference pattern; we obtain two sets of electrons on the other side, as if this time they behaved like particles. This is what is known as wave-particle duality.
This behavior will be the key to the new revolution in technology that is coming with the arrival of quantum computers, mainly because the advance in processing power of current computers has gone hand in hand with the reduction in the size of transistors. However, we will reach a point where the size of these transistors can no longer be reduced, since physical properties at these scales are not as predictable as at the macroscopic scale. Transistors allow the creation of logic gates, the basis of the Boolean algebra that lets us operate on bits; each bit can represent two states (0 = off, 1 = on), and combinations of bits allow us to store increasingly complex information. In quantum computers there are qubits, or quantum bits, which exploit the properties of quantum physics: each qubit can exist in a superposition of the two states (0 and 1), so a few qubits can represent a state space that would require many classical bits to describe. This will have a great impact in areas such as optimization, encryption, simulation, and databases.
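To make the bits-versus-qubits comparison concrete, here is a minimal sketch with NumPy (not an actual quantum computation, just the bookkeeping): a classical register of n bits holds exactly one of 2^n states, while describing an n-qubit register takes 2^n complex amplitudes, whose squared magnitudes give the measurement probabilities.

```python
import numpy as np

n_qubits = 3
dim = 2 ** n_qubits  # 3 qubits are described by 2**3 = 8 amplitudes

# A classical 3-bit register holds exactly ONE of these 8 states at a time.
# A 3-qubit state is a vector of 8 complex amplitudes describing all of
# them at once; measuring yields basis state i with probability |amp_i|^2.

# Example: a uniform superposition over all 8 basis states
# (what you get by applying a Hadamard gate to each qubit).
state = np.full(dim, 1 / np.sqrt(dim), dtype=complex)

probabilities = np.abs(state) ** 2
print(dim)                   # 8
print(probabilities.sum())   # ~1.0: probabilities always sum to 1
```

Note that the exponential cost of *simulating* qubits on a classical machine is exactly why quantum hardware is expected to pull ahead for certain problems.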
I also had the opportunity to learn about new methodologies of thinking. Many of them have been applied at Google and are said to be responsible, in large part, for its success, because they allow it to generate such innovative ideas. One of these philosophies is moonshot thinking. It starts from the premise that “it’s easier to make something 10x better than 10% better,” because the challenge of achieving something 10 times better is more interesting. But for this to work, you need a work environment that rewards not only good results but also good methodologies, an environment that invites you to take risks. In addition, for this philosophy to work, we must fail quickly, be critical, and not cling to a project we do not see a future for.
Being in this internship, where many colleagues do not have a background in computer science, caught my attention. Now I understand a little better why, and it is the philosophy of “what got you here won’t get you there.” We cannot live on past achievements if we want to keep growing; we must be constantly learning and always trying to be proactive. So it doesn’t matter what your area is, but how quickly you learn and adapt to the work.
Another skill I want to improve is the quality of my presentations, where the slides have much more impact than we think, which is why we must dedicate time to them. Our presentations must be able to make people think. To succeed, we must give the audience sequential information that is easy to follow, explain why it is useful, and simplify it. We must also be able to make the audience feel, encouraging their participation.
Programming has reached a point of maturity where innovations follow the same line of thinking. This starts from the fact that we treat programming as a dogma: we follow a methodology we have been taught, and our innovations start from there. But in its beginnings, people did not follow a dogma; they did not even know what programming was or how it should be done. This meant there was no prejudice against new ideas, encouraging proposals such as programming through GUIs.
To this day, programming has continued with the same logic as decades ago, which means we are constrained to the same way of thinking. We still carry the limitations of that time, in which, for example, there were no displays and computers only worked in a serial manner.
Regarding software development, I had the opportunity to experience TDD (test-driven development) first-hand, where you define tests that let you verify the success of your program. Viewing each of these tests as tasks leads to more compact code, helps you understand the limitations of the program, and provides a more solid structure so that the code can be reused. The principle of TDD is: “write a test that fails, make the code work, eliminate redundancy.”
When working with TDD, we must consider that the tests are:
- low level (focused on small parts of the code).
- written by the programmers themselves.
Unit tests usually work in a sociable manner, that is, they depend on other units of the code to work. They must be fast, since programmers run unit tests after any change to the code. Finally, you should have a commit suite containing all the unit tests, which is run for verification before committing.
Using TDD has the following advantages:
- Tests help find bugs
- Concrete code (if you want more complex behavior, you must add more tests)
And as disadvantages:
- The code could satisfy the tests, but not the real requirements.
I got to know some testing frameworks, such as those of the xUnit family.
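The “write a test that fails, make the code work” cycle can be sketched with Python's built-in unittest module (an xUnit-style framework). The function and tests here are a hypothetical example, not from my actual internship tasks: in real TDD the test class would be written first, watched fail, and only then would the function be implemented.

```python
import unittest

def add(a, b):
    # Step 2 ("make the code work"): the simplest implementation
    # that satisfies the tests below.
    return a + b

class TestAdd(unittest.TestCase):
    # Step 1 ("write a test that fails"): these existed before add().
    def test_adds_two_positive_numbers(self):
        self.assertEqual(add(2, 3), 5)

    def test_adds_negative_numbers(self):
        self.assertEqual(add(-1, -1), -2)

# Step 3 ("eliminate redundancy"): refactor add() freely; as long as
# the suite stays green, the behavior is preserved.

if __name__ == "__main__":
    unittest.main()
```

Each test doubles as documentation of what the code is supposed to do, which is part of why TDD yields that "more solid structure" mentioned above.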
I learned the basics of Machine Learning (ML), using the scikit-learn library and the popular TensorFlow framework. ML is a subfield of Artificial Intelligence in which the computer is expected to learn from examples and experience instead of hard-coded rules. The examples I saw belong to the area of supervised learning, where the methodology consists of the following:
- Import the dataset
- Separate the dataset into training data and testing data
- Train the model with the training data
- Evaluate the performance of the classifier with the testing data
- Make predictions
Finally, I have been learning and practicing git, a version control tool that allows us to save different versions of our code at different points in time. It also allows us to work collaboratively: different users can create their own versions of our work, and if some of their changes interest us in the future, we can merge them back into our project.
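That save-a-version, branch, and merge flow looks roughly like this in the terminal (hypothetical repository and file names; the identity-setup lines are only needed once per machine):

```shell
git init my-project && cd my-project
git config user.name "Your Name"        # one-time identity setup
git config user.email "you@example.com"
echo "print('hello')" > app.py
git add app.py                          # stage the new file
git commit -m "Add app.py"              # save this version of the code
git checkout -b experiment              # create a branch to try an idea
echo "print('bye')" >> app.py
git add app.py
git commit -m "Try an idea"             # the idea lives only on this branch
git checkout -                          # return to the original branch
git merge experiment                    # unite the branch back into our project
```

`git checkout -` jumps back to the previously checked-out branch, which avoids hard-coding whether the default branch is called `main` or `master`.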
When it was time to commit, I finally got to meet Vim, a text editor that works in the terminal; for a moment I felt like a 10x developer. Vim was created under the premise of always being efficient, allowing you to work with files very quickly. I am sure I will give it a try in the future, but for now, I will continue using good old VS Code.
This is it for this week, until next time!