
Modifying a query column when fetching

See original GitHub issue

I’m using knex-postgis to convert an encoded geometry column to readable text. I’m trying to modify the query in the ‘fetching’ event handler, but the modifications seem to be ignored.

My code:

const bookshelf = require('./.database');
const st = require('knex-postgis')(bookshelf.knex);

const Events = bookshelf.Model.extend({
    initialize: function() {
        this.on('fetching', this._locationToText);
    },
    _locationToText: function(model, columns, options) {
        options.query.select('id', st.asText('location')).from(model);
        console.log(options.query.toString());
    }
})

This logs the expected query string to the terminal, but the fetch still returns all attributes, with the location attribute still encoded in an unreadable format. Is it possible to modify the SELECT part of the query? Am I doing something wrong?

Issue Analytics

  • State: open
  • Created: 7 years ago
  • Comments: 16 (1 by maintainers)

Top GitHub Comments

1 reaction
ghost commented, Apr 22, 2016

I actually just got this working in an even simpler way. Now doing:

  initialize: function() {
    this.on('fetching', function(model, columns, options) {
      // If the whole table is selected, or 'location' is requested explicitly,
      // also select it as GeoJSON so it comes back in a readable form.
      if(columns[0] === `${this.tableName}.*` || columns.indexOf('location') >= 0) {
        columns.push(st.asGeoJSON('location'));
      }
    });

    return bookshelf.Model.prototype.initialize.apply(this, arguments);
  }

It just checks whether location should be returned, based on what’s in the columns parameter. If it should, it pushes st.asGeoJSON('location') onto columns, and Knex takes care of modifying the query after that.

Calling the parent initialize with the original arguments should only be necessary if you’re using some other library that also overrides initialize, as you stated before.
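
For completeness, here is a hypothetical usage sketch (the Events model and the row id are assumptions, not from the thread). As far as I know, knex-postgis aliases asGeoJSON('location') back to the location column name, so the attribute should come back as GeoJSON text:

// Hypothetical usage sketch: the model name and row id are assumptions.
new Events({ id: 1 })
  .fetch()
  .then(function(event) {
    // 'location' should now arrive as a GeoJSON string rather than the raw encoding
    var geojson = JSON.parse(event.get('location'));
    console.log(geojson.type, geojson.coordinates);
  });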

0 reactions
vellotis commented, Oct 19, 2016

I think the last-mentioned problem can be solved with parse & format. Since there is no automatic way to know which fields need to be parsed when fetching from the database and formatted when saving or updating, you have to declare them on the Model.

For example, I’m using MySQL 5.7, which supports a JSON type, but mysql2 doesn’t support it out of the box and treats the value as a string. So I have to parse/stringify the field on every read/write operation. For this I made a Bookshelf plugin for myself:

'use strict';
var Promise = require('bluebird'),
    _ = require('lodash');

// Parse declared JSON columns from their string form after a fetch
function parseJsonColumns(attrs) {
  if (attrs && this.jsonColumns) {
    this.jsonColumns.forEach(function(key) {
      if (_.isString(attrs[key])) {
        attrs[key] = JSON.parse(attrs[key]);
      }
    });
  }
  return attrs;
};

// Stringify declared JSON columns before a save/update
function formatJsonColumns(attrs) {
  if (attrs && this.jsonColumns) {
    this.jsonColumns.forEach(function(key) {
      if (_.isPlainObject(attrs[key])) {
        attrs[key] = JSON.stringify(attrs[key]);
      }
    });
  }
  return attrs;
};

module.exports = function(Bookshelf) {
  if (!Bookshelf) {
    throw new Error('Must pass an initialized bookshelf instance');
  }
  var ModelCtor = Bookshelf.Model;

  Bookshelf.Model = ModelCtor.extend({
    parse: function(attrs) {
      parseJsonColumns.apply(this, arguments);
      return ModelCtor.prototype.parse.apply(this, arguments);
    },
    format: function(attrs) {
      formatJsonColumns.apply(this, arguments);
      return ModelCtor.prototype.format.apply(this, arguments);
    }
  });
};
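
To actually use it, the plugin function has to be registered on the Bookshelf instance; a minimal sketch, assuming the plugin lives in a local file:

// Hypothetical registration: the require paths are assumptions.
const bookshelf = require('./.database');
bookshelf.plugin(require('./plugins/json-columns'));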

Then you declare jsonColumns on the model:

const Events = bookshelf.Model.extend({
    tableName: 'events',
    jsonColumns: ['someColumnName', 'otherColumnName']
})
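
Hypothetically, a round trip with that model then looks like this (the values are made up; format() runs on save, parse() on fetch):

// Hypothetical round trip: the values are assumptions.
new Events({ someColumnName: { kind: 'concert', capacity: 500 } })
  .save()                                               // format() stringifies the object for MySQL
  .then(saved => new Events({ id: saved.id }).fetch())  // read the row back
  .then(event => {
    // parse() has turned the stored string back into a plain object
    console.log(event.get('someColumnName').capacity);  // 500
  });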

So here is a quick refactoring of the plugin for PostGIS:

'use strict';
const Promise = require('bluebird'),
      _ = require('lodash');

module.exports = function(Bookshelf) {
  if (!Bookshelf) {
    throw new Error('Must pass an initialized bookshelf instance');
  }
  const ModelCtor = Bookshelf.Model,
        st = Bookshelf.knex.postgis;

  function parseGISColumns(attrs) {
    if (attrs && this.GISColumns) {
      this.GISColumns.forEach(function(key) {
        if (_.isString(attrs[key])) {
          attrs[key] = JSON.parse(attrs[key])
        }
      })
    }
    return attrs
  }

  function formatGISColumns(attrs) {
    if (attrs && this.GISColumns) {
      this.GISColumns.forEach(function(key) {
        if (_.isPlainObject(attrs[key])) {
          attrs[key] = st.geomFromGeoJSON(attrs[key])
        }
      });
    }
    return attrs
  }

  Bookshelf.Model = ModelCtor.extend({
    initialize () {
      this.on('fetching', function (model, columns, options) {
        const gisColumns = _.intersection(this.GISColumns, columns)
        if (!_.isEmpty(gisColumns)) {
          // Replace each explicitly requested GIS column with its GeoJSON form
          gisColumns.forEach(function (column) {
            const index = columns.indexOf(column)
            columns.splice(index, 1, st.asGeoJSON(column))
          })
        } else if (this.GISColumns && _.includes(columns, `${this.tableName}.*`)) {
          // "table.*" select: append the GeoJSON form of every declared GIS column
          columns.push.apply(columns, this.GISColumns.map(column => st.asGeoJSON(column)))
        }
      })
      return ModelCtor.prototype.initialize.apply(this, arguments)
    },
    // Actually this may not be needed, as the returned value should already be in JSON format
    parse (attrs) {
      parseGISColumns.apply(this, arguments)
      return ModelCtor.prototype.parse.apply(this, arguments)
    },
    format (attrs) {
      formatGISColumns.apply(this, arguments)
      return ModelCtor.prototype.format.apply(this, arguments)
    }
  })
};

And use it just by defining GISColumns:

const Events = bookshelf.Model.extend({
    tableName: 'events',
    GISColumns: ['location']
})
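
With that in place, a hypothetical fetch (the row id and stored geometry are assumptions) would have the fetching handler swap location for st.asGeoJSON('location') in the select, and parse would hand back an object:

// Hypothetical fetch: the row id and stored geometry are assumptions.
new Events({ id: 1 })
  .fetch({ columns: ['id', 'location'] })
  .then(event => {
    // 'location' was selected as GeoJSON and revived by parseGISColumns
    console.log(event.get('location')); // e.g. { type: 'Point', coordinates: [...] }
  });
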
Read more comments on GitHub
