Nginx proxy with prefix


To proxy requests under a prefix and strip the prefix from the URL passed to the upstream, it can be done this way:

    location /my_app {
        rewrite ^/my_app/?(.*)$ /$1 break;
        proxy_pass http://my_app_upstream;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        ...
    }

Without the rewrite nginx will pass /my_app to the upstream, so the prefix has to be handled at the my_app level (there is a sketch of that at the end of this post).

    location /my_app {
        proxy_pass http://my_app_upstream;
    }

And with a trailing / in proxy_pass it will add an extra /, so URLs in the proxied service will look like //some-url:

    location /my_app {
        proxy_pass http://my_app_upstream/;
    }
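
If the prefix is handled on the my_app side instead, one option, assuming my_app is a Flask/WSGI service (which is not stated above), is to mount the whole application under the prefix. A minimal sketch:

    from flask import Flask
    from werkzeug.exceptions import NotFound
    from werkzeug.middleware.dispatcher import DispatcherMiddleware

    app = Flask(__name__)

    # Requests to /my_app/... are dispatched to the app with SCRIPT_NAME=/my_app,
    # so routes match without the prefix and generated URLs still include it;
    # anything outside /my_app gets a 404.
    app.wsgi_app = DispatcherMiddleware(NotFound(), {'/my_app': app.wsgi_app})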

How to connect to a service in docker swarm with docker


It’s a rare case: suppose there is a myapp swarm cluster with a myapp_mongo database that has no published port, and there is a need to run a command from some docker image that connects to this database.

By default docker stack deploy creates a non-attachable network, so docker run --network myapp_default will output Error response from daemon: Could not attach to network myapp_default: rpc error: code = PermissionDenied desc = network myapp_default not manually attachable.

A way to bypass this is to create a new attachable network, attach it to the service, run the command, and then clean up:

    docker network create --driver overlay --attachable mongo_network
    docker service update --network-add mongo_network myapp_mongo
    docker run --rm --network mongo_network mongo:4.0.6 mongodump -h myapp_mongo ...
    docker service update --network-rm mongo_network myapp_mongo
    docker network rm mongo_network

Get blinker signal's receiver names

To check in tests that a signal is connected to a particular receiver, the receiver names can be collected from signal.receivers:

    from unittest import TestCase
    from app.main import init_app
    from app.signals import my_signal


    class SignalTests(TestCase):

        def setUp(self):
            init_app()

        def test_signal_is_connected_to_my_receiver(self):
            # receivers holds weak references by default, so call them
            # to get back the connected functions
            receivers = [r().__name__ for r in my_signal.receivers.values()]
            self.assertIn('my_receiver', receivers)
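
For reference, a possible shape of app/signals.py and init_app that the test above assumes (the signal name, module layout and receiver body here are guesses, not the original code):

    # app/signals.py
    import blinker

    my_signal = blinker.signal('my-signal')


    # app/main.py
    from app.signals import my_signal


    def my_receiver(sender, **kwargs):
        pass  # hypothetical receiver


    def init_app():
        # connect() keeps a weak reference by default, which is why the test
        # dereferences receivers.values() with r()
        my_signal.connect(my_receiver)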

MongoDB dump and restore with docker


Here are two commands to take a partial dump of a collection from the production database and restore it into a dev mongo instance running through docker-compose.

    docker run -v `pwd`/:/dump mongo mongodump --gzip --archive=/dump/my_collection.agz \
        --host <connection url> --ssl --username <username> --password <password> \
        --authenticationDatabase admin --db <prod_db> --collection my_collection \
        --query '{date: {$gte: ISODate("2019-02-01T00:00:00.000+0000")}}'

    docker-compose run -v `pwd`/my_collection.agz:/my_collection.agz mongo mongorestore --gzip \
        --archive=/my_collection.agz --host mongo --nsFrom <prod_db>.my_collection \
        --nsTo <dev_db>.my_collection

Easy charts in python app with plotly


An example of how to add a chart to a Flask app with the plotly library:

    import plotly
    from flask import Flask, render_template

    app = Flask(__name__)

    # mongo_db is a pymongo database with an 'item' collection, defined elsewhere


    def get_chart_data():
        # count items per month
        return mongo_db['item'].aggregate([
            {'$sort': {'date': 1}},
            {'$group': {
                '_id': {'year': {'$year': '$date'}, 'month': {'$month': '$date'}},
                'num': {'$sum': 1}
            }},
            {'$project': {
                'date': {'$dateFromParts': {'year': '$_id.year', 'month': '$_id.month'}},
                'num': '$num'
            }},
            {'$sort': {'date': 1}}
        ])


    def get_figure():
        data = list(get_chart_data())
        layout = plotly.graph_objs.Layout(title='Items by month')
        scatter_data = [
            plotly.graph_objs.Scatter(
                x=[d['date'] for d in data],
                y=[d['num'] for d in data]
            )
        ]
        return plotly.graph_objs.Figure(data=scatter_data, layout=layout)


    def get_chart():
        # render the chart as a standalone <div> with plotly.js included
        return plotly.offline.plot(get_figure(), include_plotlyjs=True, output_type='div')


    @app.route('/chart')
    def chart():
        return render_template('chart.html', chart=get_chart())

Jinja template (chart.html):

    {{ chart|safe }}

For a jupyter notebook it will be:

    from plotly.offline import init_notebook_mode, iplot

    init_notebook_mode(connected=True)
    # iplot expects the figure itself, not the rendered <div>
    iplot(get_figure())

Mongoengine as_pymongo performance


When you need to get only a few fields from a list of complex objects, it is much faster with the as_pymongo queryset method.

In my case it gives a 16x speedup:

    %timeit list(SomeObject.objects.scalar('id')[0:500])
    # 129 ms ± 11 ms per loop

    %timeit list(o['_id'] for o in SomeObject.objects.scalar('id')[0:500].as_pymongo())
    # 7.98 ms ± 849 µs per loop
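
The same trick works when several fields are needed: .only() limits the projection and .as_pymongo() returns plain dicts instead of constructing documents. A sketch with made-up field names:

    # each item is a plain dict, e.g. {'_id': ObjectId(...), 'name': ...}
    items = SomeObject.objects.only('id', 'name')[0:500].as_pymongo()
    names = [item['name'] for item in items]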

How to find the document with maximum size in MongoDB collection


The command to get a document's size is Object.bsonsize. The following query finds the largest document, but run it only on a small collection because it can be slow:

    db.getCollection('my_collection').find({}).map(doc => {
        return {_id: doc._id, size: Object.bsonsize(doc)};
    }).reduce((a, b) => a.size > b.size ? a : b)

To do it faster, use mongo mapReduce:

    db.getCollection('my_collection').mapReduce(
        function() {
            emit('size', {_id: this._id, size: Object.bsonsize(this)});
        },
        function(key, values) {
            return values.reduce((a, b) => a.size > b.size ? a : b);
        },
        {out: {inline: 1}}
    )
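
On MongoDB 4.4 or newer the same result can also be obtained with an aggregation and the $bsonSize operator; a sketch using pymongo, with a placeholder client and names:

    from pymongo import MongoClient

    db = MongoClient()['my_db']

    # compute each document's BSON size and keep only the largest one
    largest = next(db['my_collection'].aggregate([
        {'$project': {'size': {'$bsonSize': '$$ROOT'}}},
        {'$sort': {'size': -1}},
        {'$limit': 1}
    ], allowDiskUse=True))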