Quantum computing is very different from the classical computing we are all used to. I realized that the day I decided to learn the basics of this amazing field and write a simple "Hello World!" program. It is not just about printing a string on the screen, but about using the properties of Quantum Mechanics to solve a particular problem. I needed to come up with a creative problem to solve!
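To give a feel for what such a program can look like, here is a minimal sketch using Qiskit (my choice for this illustration, not necessarily the framework used later): one qubit is put into superposition with a Hadamard gate and then measured, so roughly half of the runs return 0 and half return 1.

```python
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator  # assumes qiskit and qiskit-aer are installed

# A quantum "Hello World": one qubit in superposition, then a measurement.
qc = QuantumCircuit(1, 1)
qc.h(0)            # Hadamard gate: equal superposition of |0> and |1>
qc.measure(0, 0)   # collapse the qubit and record the classical bit

# Run on a local simulator; the counts should come out roughly 50/50.
counts = AerSimulator().run(qc, shots=1000).result().get_counts()
print(counts)      # e.g. {'0': 507, '1': 493}
```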
Some months ago I started working at MongoDB as a Technical Services Engineer. It was a big and exciting change for me after several years in the relational world of MySQL. During those years, a large percentage of customer cases were related to bad query plans and wrong index selection. With MongoDB I found that the number of cases on that topic was very, very (very) low. So I asked myself... why?
Facebook recently released a forecasting tool called Prophet. Prophet can forecast a particular metric we are interested in: it works by fitting time-series data to predict how that metric will look in the future. In this blog post, we'll look at how Prophet can forecast metrics.
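As a rough sketch of that workflow (the metric.csv file and the 30-day horizon are just placeholders for this example), Prophet takes a dataframe with a ds timestamp column and a y value column, fits it, and predicts over an extended timeline:

```python
import pandas as pd
from prophet import Prophet  # older releases ship the same class as "fbprophet"

# Historical values of the metric we want to forecast.
# "metric.csv" is a placeholder: it must contain a "ds" (timestamp)
# and a "y" (value) column, which is the format Prophet expects.
df = pd.read_csv("metric.csv")

m = Prophet()
m.fit(df)

# Extend the timeline 30 days into the future and predict the metric,
# including its uncertainty interval (yhat_lower / yhat_upper).
future = m.make_future_dataframe(periods=30)
forecast = m.predict(future)
print(forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]].tail())
```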
Let's admit it: monitoring services is one of the most difficult tasks. It is time-consuming, error-prone and difficult to automate. The usual monitoring approach has been pretty straightforward in the last few years: set up a service like Nagios, or pay for a cloud-based monitoring tool. Then choose the metrics you are interested in and set the thresholds. This is a manual process that works when you have a small number of services and servers, and you know exactly how they behave and what you should monitor. These days, we have hundreds of servers with thousands of services sending us millions of metrics. That is the first problem: the manual approach to configuration doesn't work.
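To picture what that manual approach boils down to, here is a toy sketch of a static threshold check, the kind of rule a Nagios-style setup encodes by hand for every single metric (the numbers are made up):

```python
# Hand-picked thresholds for one metric on one server. Multiply this
# by thousands of services and millions of metrics and it stops scaling.
CPU_WARN, CPU_CRIT = 80.0, 95.0

def check_cpu(value: float) -> str:
    if value >= CPU_CRIT:
        return "CRITICAL"
    if value >= CPU_WARN:
        return "WARNING"
    return "OK"

print(check_cpu(91.5))  # WARNING
```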
Batch Normalization is not a newly discovered technique. It was described back in 1998 by Yann LeCun et al. in the paper Efficient BackProp. But I am the kind of person who prefers visual and straightforward examples that show how the theory applies in real life. It makes things easier to remember, don't you think?
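In that spirit, here is a small numeric example (plain NumPy, with gamma and beta fixed instead of learned) of what the batch normalization step actually does to a mini-batch of activations:

```python
import numpy as np

# A mini-batch of 3 samples with 2 features each (made-up numbers).
x = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

eps = 1e-5               # numerical stability term
gamma, beta = 1.0, 0.0   # learnable scale and shift, fixed here for simplicity

# Normalize each feature to zero mean and unit variance over the batch,
# then scale and shift.
mean = x.mean(axis=0)
var = x.var(axis=0)
x_hat = (x - mean) / np.sqrt(var + eps)
y = gamma * x_hat + beta

print(y.mean(axis=0))  # ~[0, 0]
print(y.std(axis=0))   # ~[1, 1]
```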
The Docker CLI tool has many commands. If you also manage a large number of containers, images and networks, you will find yourself going back and forth trying to remember all the names, identifiers and parameters. To get some help with this task we can use bash completion so it does all the magic for us when we press the Tab key.
Serverless. That is the new magic word. Forget about containers, virtualization, operating systems, patching, scaling... Just write the code and upload it to AWS Lambda.
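In practice, "just write the code" looks roughly like this minimal Python handler (the function name and response shape follow the usual Lambda conventions; the "name" field is a hypothetical input, nothing specific to a real service):

```python
import json

# Lambda invokes this function with the request payload in "event"
# and runtime metadata in "context".
def lambda_handler(event, context):
    name = event.get("name", "world")  # hypothetical input field
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```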
Sometimes I need to upload really large .zip files to S3. The aws CLI or the web interface works perfectly for this task. The problem is, at least in my case, the bad quality of the internet connection. If for some reason it goes down, I need to start the upload again from the very beginning.
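One way around that, sketched below with boto3 (the bucket, key and part size are placeholders), is a multipart upload: the file goes up in independent parts, so a dropped connection only costs you the part that was in flight, not the whole archive.

```python
import boto3

s3 = boto3.client("s3")
bucket, key, path = "my-bucket", "backup.zip", "backup.zip"  # placeholders
part_size = 100 * 1024 * 1024  # 100 MB per part

# Start the multipart upload and send the file piece by piece.
mpu = s3.create_multipart_upload(Bucket=bucket, Key=key)
parts = []

with open(path, "rb") as f:
    part_number = 1
    while True:
        data = f.read(part_size)
        if not data:
            break
        # Each part is an independent request, so it can be retried
        # on its own if the connection drops.
        resp = s3.upload_part(
            Bucket=bucket, Key=key, UploadId=mpu["UploadId"],
            PartNumber=part_number, Body=data,
        )
        parts.append({"ETag": resp["ETag"], "PartNumber": part_number})
        part_number += 1

# Tell S3 to assemble the uploaded parts into the final object.
s3.complete_multipart_upload(
    Bucket=bucket, Key=key, UploadId=mpu["UploadId"],
    MultipartUpload={"Parts": parts},
)
```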