
Big data set issue

Hello, I have an issue with handling a big data set in Vue.js.

The app: The app has around 50k objects in memory; let's call them items. Every item has an _id and a name property. I want to show the items to the user one at a time: the user picks an item, then I put that item into the Vue instance to render it on screen.

The problem: The data set is about 4 MB. If I put the whole data set (the items object) into a Vue instance, the 4 MB of data plus the observer properties grows to 85 MB. I also tried putting the data set into a Vuex store; the result was 89 MB of memory usage. (This is only a test case; the real production data is around 100 MB, so in a Vuex store it would presumably run out of memory immediately.)

So I did not put all the data in the instance, because I do not want reactivity for all the objects, only for the one the user has picked at that moment. Instead I store the data set in a plain object, and when the user picks an item I put that item into the Vue instance, so I get reactivity for just that one item. Everything works, except that it causes a memory leak: every time I put an item into a Vue instance, a new observer property is added to the object. Even if the object was in the instance before, a new observer is added next to the unused ones. This works as intended, I guess, but my issue/feature request is: can I somehow remove unused observers and getters/setters, or are there other practices for handling big data sets in Vue?
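
To make the overhead concrete, you can inspect an object after Vue has observed it. The snippet below is a minimal sketch; it relies on Vue 2's internal __ob__ marker, which is an implementation detail rather than public API:

// a plain object before and after Vue 2 observes it
var item = { _id: 1, name: 'item1' };

var vm = new Vue({
    data: {
        item: item
    }
});

// Vue has attached its internal Observer and converted the plain
// properties into reactive getter/setter pairs
console.log(item.__ob__);                                        // Observer instance
console.log(Object.getOwnPropertyDescriptor(item, 'name').get);  // reactive getter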

Things that would solve the issue, but that I can't use: I could deep clone the item before putting it into the Vue instance, but I need to keep the original object reference, because other parts of the app update the object and I want to keep reactivity. I could use a database engine, but I need offline capability and fast, IO-less in-memory search locally.

Here are the test cases I made:

// Test case 1: data in a Vuex store (89 MB)
var store = new Vuex.Store({
    state: {
        items: {}
    },
    mutations: {
        addItem(state, item) {
            Vue.set(state.items, item._id, item);
        }
    }
});

for (var index = 0; index < 50000; index++) {
    store.commit('addItem', { _id: index, name: 'item' + index })
}

var app = new Vue({
    el: '#app',
    data: {
        pickedItemId: 0
    },
    computed: {
        item() {
            return store.state.items[this.pickedItemId]
        }
    }
});

// Test case 2: everything in the Vue instance (85 MB)
var app = new Vue({
    el: '#app',
    data: {
        items: (function () {
            var elems = {};

            for (var index = 0; index < 50000; index++) {
                elems[index] = { _id: index, name: 'item' + index }
            }

            return elems;
        } ()),
        item: {}
    }
});

// Test case 3: plain object (4 MB), but memory leaks every time app.item is updated with the picked item
var items = (function () {
    var elems = {};

    for (var index = 0; index < 50000; index++) {
        elems[index] = { _id: index, name: 'item' + index }
    }

    return elems;
} ())

var app = new Vue({
    el: '#app',
    data: {
        item: {}
    }
});
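
The picking step itself is not shown above; it is just an assignment like the sketch below (pickItem is a hypothetical helper, not part of the original code). Per the report above, each such assignment makes Vue observe the plain object again:

// hypothetical picking step: assigning a plain item into the
// reactive instance triggers observation of the object again,
// which is where the reported leak shows up
function pickItem(id) {
    app.item = items[id];
}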

Issue Analytics

  • State: closed
  • Created: 7 years ago
  • Reactions: 4
  • Comments: 23 (9 by maintainers)

Top GitHub Comments

yyx990803 commented, Dec 8, 2016 (57 reactions)

Will any of these items be mutated? If they are not, you can simply call Object.freeze on them before setting them on the Vue instance. This lets Vue know that it can skip observing the internals of these items, and basically solves your memory issue.

Even if you do need to mutate them, you can instead do something like this.item = Object.freeze(Object.assign({}, this.item, changedFields)).
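
Putting the two suggestions together, a minimal sketch of the frozen-items approach might look like this (it mirrors the test cases above; pick and update are illustrative names, not part of the original issue):

// items are frozen up front, so Vue skips observing their internals
// and the memory footprint stays close to the plain-object case
var items = (function () {
    var elems = {};

    for (var index = 0; index < 50000; index++) {
        elems[index] = Object.freeze({ _id: index, name: 'item' + index });
    }

    return elems;
} ());

var app = new Vue({
    el: '#app',
    data: {
        item: {}
    },
    methods: {
        pick: function (id) {
            // frozen objects can be swapped in freely; Vue will not
            // try to convert their properties into getters/setters
            this.item = items[id];
        },
        update: function (changedFields) {
            // "mutate" by replacing: merge the changes into a fresh
            // frozen copy, as suggested in the comment above
            this.item = Object.freeze(Object.assign({}, this.item, changedFields));
        }
    }
});

Note that update replaces the reference rather than mutating in place, so any other part of the app holding the old object will not see the change; that trade-off is inherent to the freeze approach.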

kunKun-tx commented, Aug 14, 2017 (14 reactions)

Object.freeze saves almost 800 MB of memory on my big 5K nested array!
