migration from 8.11.4 to 10.1.3 memory shortage on github action #907
Alternatively, you could also try to use

I don't know anything about MMS that could leak that much, and I am also not aware of any change in mongodb that would lead to a massive increase. Do you clean your database between test suites? Is the log / repository public? If so, it would be great to investigate there.
@hasezoey I could try version 9 too, indeed (they might have a higher default binary, that's true). Yes, I do have events between tests to clean and clear:

```js
// DROP TEST DATABASE
beforeEach(async () => await dbhandler.clearDatabase())
```

The create and clear events are quite standard:

```js
const mongoose = require("mongoose")
const { MongoMemoryServer } = require("mongodb-memory-server")

let mongo

module.exports.connect = async () => {
  const systemBinary = process.env.MONGOMS_SYSTEM_BINARY || undefined
  const version = process.env.MONGOMS_VERSION || "6.0.19"
  mongo = await MongoMemoryServer.create({ binary: { version, systemBinary } })
  const mongoUri = await mongo.getUri()
  await mongoose.set("strictQuery", true)
  await mongoose.connect(mongoUri)
}

module.exports.clearDatabase = async () => {
  const collections = await mongoose.connection.db.collections()
  for (let collection of collections) {
    await collection.deleteMany({})
  }
}
```

I don't make use of
To confirm, are you using jest with global-setup? Or do you start a mongodb-memory-server instance for each test suite?
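(For context: jest's "global-setup" here refers to the `globalSetup`/`globalTeardown` config keys, which run once for the whole test run rather than once per test file. A shared-server sketch of that config, with illustrative file paths that are not from this thread:)

```javascript
// jest.config.js (sketch, illustrative paths): one shared MongoMemoryServer
// for the entire run instead of one per test file.
module.exports = {
  testEnvironment: "node",
  globalSetup: "./src/test/global-setup.js",       // would start the server and export its URI
  globalTeardown: "./src/test/global-teardown.js", // would stop it after all suites finish
};
```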
Yes, I do have a global setup for jest that is used by all the tests:

```json
"jest": {
  "testEnvironment": "node",
  "setupFilesAfterEnv": [
    "./src/test/setup_test.js"
  ]
}
```

The main differences between local and CI/CD are the commands I launch, which are respectively:

```json
{
  "test": "jest --watchAll --runInBand --verbose",
  "test:ci": "jest --forceExit --detectOpenHandles --maxWorkers=10"
}
```

To give you an idea of what a test looks like:

```js
const request = require("supertest")
const app = require("../../../../app")

const eventBuilder = async () => {
  const listener = new PreprofileCreatedList(NatsWrapper.client())
  const data = global.preprofileGenerator({})
  const msg = { ack: jest.fn() }
  return { listener, data, msg }
}

it("returns a 200 when fetching list of preprofiles as admin", async () => {
  // GENERATE AND APPROVE 3 ARCHITECT PREPROFILES
  for (let i = 0; i < 3; i++) {
    const { listener, data, msg } = await eventBuilder()
    await listener.onMessage(data, msg)
    const preprofile = await PreProfile.findById(data._id)
    expect(preprofile._id.toString()).toEqual(data._id)
  }
  // FETCH ARCHITECT WITH ADMIN API
  const adminCookie = await global.adminRegister()
  const page = 0
  const limit = 100
  const { body } = await request(app)
    .get(`/api/architect/admin/example-endpoint/list?page=${page}&limit=${limit}`)
    .set({ Host: "www.example.com" })
    .set("Cookie", adminCookie)
    .expect(200)
  expect(body[0].total).toEqual([{ count: 3 }])
  expect(body[0].preprofiles).toHaveLength(3)
})
```
EDIT: It seems that jest increased heap usage with each test, which led to this ENOMEM. So my problem below is not directly with MMS, but with jest instead.

Hey, I'm not sure if this is related, but since updating Node in my Dockerfile from 20 to 22.13 and then also updating mongodb-memory-server to 10.1.3, I see a new ENOMEM error when testing, which didn't happen before. I read that it might have to do with virtual memory and that increasing vm.max_map_count might help. However, since I'm using an AWS build environment and Fargate, I think that's not possible there (still researching). I'll try changing the binary and see if that helps; didn't try that so far. Here the error:
Well, this is weird. This means that the mongodb binary couldn't even start because it has no memory. To my knowledge, this issue (#907) is about running out of memory while a binary is already running (though I don't know yet if the issue is mongodb, MMS, or something else here). I don't think I can help in your case; you will need to look into why it is already out-of-memory at that point. (If it is actually MMS, please open a new issue.)
It is an issue with v10, obviously. When I used v10, resources were created in
From what I recall and can tell from comparing (between

Thanks for providing repro code, and I can reproduce it, weirdly enough. Though note that even without modifications, your provided code fails.

Fail logs of repro code:

$ node --test --import ./test.setup.mjs
(node:11139) [DEP0040] DeprecationWarning: The `punycode` module is deprecated. Please use a userland alternative instead.
(Use `node --trace-deprecation ...` to show where the warning was created)
▶ TaskService.create
✔ creates a task and returns it (23.514855ms)
✔ TaskService.create (65.394208ms)
▶ TaskService.getAllForUser
✔ returns all the tasks for a user (13.911151ms)
✔ TaskService.getAllForUser (14.167921ms)
▶ TaskService.getOneWithUser
✔ throws an error if task is not found (3.927656ms)
✔ returns a task with the user details (10.449805ms)
✔ TaskService.getOneWithUser (14.713043ms)
(node:11140) [DEP0040] DeprecationWarning: The `punycode` module is deprecated. Please use a userland alternative instead.
(Use `node --trace-deprecation ...` to show where the warning was created)
▶ UserService.create
✔ creates and returns a user (25.261628ms)
✔ throws an error when email exists already (6.768191ms)
✔ UserService.create (72.224372ms)
▶ UserService.getById
✖ throws an error if user is not found (3.618324ms)
AssertionError [ERR_ASSERTION]: Missing expected rejection.
at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
at async Test.run (node:internal/test_runner/test:935:9)
at async Promise.all (index 0)
at async Suite.run (node:internal/test_runner/test:1320:7)
at async Test.processPendingSubtests (node:internal/test_runner/test:633:7) {
generatedMessage: false,
code: 'ERR_ASSERTION',
actual: undefined,
expected: undefined,
operator: 'rejects'
}
✖ returns a user with their total tasks (7.296001ms)
TypeError [Error]: email.toLowerCase is not a function
at Module.create (file:///home/hasezoey/Downloads/test/nodejs-test-runner-mongoose/user/service.mjs:16:61)
at TestContext.<anonymous> (file:///home/hasezoey/Downloads/test/nodejs-test-runner-mongoose/user/__tests__/service.test.mjs:50:21)
at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
at async Test.run (node:internal/test_runner/test:935:9)
at async Suite.processPendingSubtests (node:internal/test_runner/test:633:7)
✖ UserService.getById (11.493237ms)
ℹ tests 8
ℹ suites 5
ℹ pass 6
ℹ fail 2
ℹ cancelled 0
ℹ skipped 0
ℹ todo 0
ℹ duration_ms 406.52901
✖ failing tests:
test at user/__tests__/service.test.mjs:34:5
✖ throws an error if user is not found (3.618324ms)
AssertionError [ERR_ASSERTION]: Missing expected rejection.
at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
at async Test.run (node:internal/test_runner/test:935:9)
at async Promise.all (index 0)
at async Suite.run (node:internal/test_runner/test:1320:7)
at async Test.processPendingSubtests (node:internal/test_runner/test:633:7) {
generatedMessage: false,
code: 'ERR_ASSERTION',
actual: undefined,
expected: undefined,
operator: 'rejects'
}
test at user/__tests__/service.test.mjs:38:5
✖ returns a user with their total tasks (7.296001ms)
TypeError [Error]: email.toLowerCase is not a function
at Module.create (file:///home/hasezoey/Downloads/test/nodejs-test-runner-mongoose/user/service.mjs:16:61)
at TestContext.<anonymous> (file:///home/hasezoey/Downloads/test/nodejs-test-runner-mongoose/user/__tests__/service.test.mjs:50:21)
at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
at async Test.run (node:internal/test_runner/test:935:9)
at async Suite.processPendingSubtests (node:internal/test_runner/test:633:7)
error Command failed with exit code 1.

I will further investigate this, but from my first findings, somehow the paths in the logs don't match up with the ones that still exist after, and the ones that exist after are also never mentioned in the logs. Very weird.
Hello @hasezoey, yes. The two tests that fail were supposed to fail. Can you point to the part of the documentation where one can explicitly enable the Explicit Resource Management feature? What do you mean by "the paths in the logs don't match up with the ones that still exist after, and the ones that exist after are also never mentioned in the logs"?
In the logs (when running with debug logs), it for example only says it creates

I have further debugged this; it seems like the problem is the killer script, it somehow gets the

TL;DR: somehow the tests

I have not seen this behavior outside of

This is definitely a problem, though I don't know why, but unless @erwanriou can confirm this behavior in the jest case, it is a different problem.
Thank you for this. It's insightful.
I have created #912 to track this.
I don't think this is necessary; see nodejs/node#52930 and nodejs/node#52939.
Sorry, forgot about this; see FAQ: Does this package support Explicit Resource Management? and the ECMAScript Explicit Resource Management Proposal. mongodb-memory-server example:

```js
async function main() {
  await using dbServer = await MongoMemoryServer.create();
  // do your stuff
  // automatically disposed at the end of the variable's scope, calling ".stop()", unless disabled via options
}
```
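Under the hood, `await using` calls the object's `Symbol.asyncDispose` method when the variable goes out of scope. A minimal, dependency-free sketch of that protocol (the `FakeServer` class is hypothetical, standing in for `MongoMemoryServer`):

```javascript
// Polyfill the well-known symbol on Node versions that predate it.
Symbol.asyncDispose ??= Symbol("Symbol.asyncDispose");

class FakeServer {
  constructor() { this.stopped = false; }
  async stop() { this.stopped = true; }
  // `await using server = ...` would call this automatically at scope exit.
  async [Symbol.asyncDispose]() { await this.stop(); }
}

async function main() {
  const server = new FakeServer();
  try {
    // ... use the server ...
  } finally {
    // what `await using` does implicitly at the end of the scope:
    await server[Symbol.asyncDispose]();
  }
  return server.stopped;
}

main().then((stopped) => console.log("stopped:", stopped));
```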
Sorry guys, I completely forgot to check my logs. My issue was indeed when launching jest and not related to node:test.

```yaml
- uses: actions/checkout@v3
- name: Check initial disk space
  run: df -h
- name: Clear temporary files
  run: sudo rm -rf /tmp/* || true
- name: Install dependencies
  run: |-
    echo -e "//npm.pkg.github.com/:_authToken=${{secrets.CI_TOKEN}} \n@archsplace:registry=https://npm.pkg.github.com/archsplace \nregistry=https://registry.npmjs.org \ntimeout=60000 \nupdate-notifier=false" > .npmrc && npm i
- name: Preload MongoDB binary
  run: |
    mkdir -p /tmp/mongodb-binaries
    curl -o /tmp/mongodb-binaries/mongodb-linux-x86_64-ubuntu2204-6.0.19.tgz https://fastdl.mongodb.org/linux/mongodb-linux-x86_64-ubuntu2204-6.0.19.tgz
    tar -xvzf /tmp/mongodb-binaries/mongodb-linux-x86_64-ubuntu2204-6.0.19.tgz -C /tmp/mongodb-binaries
    echo "MongoDB binary downloaded and extracted."
- name: Run tests
  env:
    MONGOMS_DOWNLOAD_URL: https://fastdl.mongodb.org/linux/mongodb-linux-x86_64-ubuntu2204-6.0.19.tgz
    MONGOMS_SYSTEM_BINARY: /tmp/mongodb-binaries/mongodb-linux-x86_64-ubuntu2204-6.0.19/bin/mongod
  run: |
    npm run test:ci || {
      echo "Test failure encountered (possibly ENOSPC). Capturing /tmp disk usage:";
      du -xh /tmp;
      echo "Process list (sorted by memory):";
      ps aux --sort=pmem;
      exit 1;
    }
```

And as crazy as it seems, I tried to reproduce the bug, but now all the tests keep passing with the latest version. It's true that I am enforcing a specific mongodb version, but still, I don't know why it would be working now. I could remove the logic enforcing a specific mongodb binary and check, maybe...
Versions
package: mongodb-memory-server

What is the Problem?

I updated mongodb-memory-server to latest in order to pass a test that makes use of a feature added in MongoDB 5.2, and the test passed locally (though much, much slower). But since deploying it on the CI, the GitHub Action times out due to memory, even though I don't have that many tests in it.
Code Example

I have a total of 78 test suites (in some other BEs I have more than 200 without any issues).

I believe a screenshot is better than more explanations:

So I believe there might be some sort of memory leak underneath that generates the issue. In the meantime I will do
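A cheap way to check for the suspected leak is jest's `--logHeapUsage` flag, or sampling `process.memoryUsage()` yourself and watching whether the numbers only ever go up. A minimal sketch:

```javascript
// Sketch: sample heap usage across iterations to spot monotonic growth.
// In jest, the same idea can go in an afterEach hook (or use --logHeapUsage).
function heapUsedMb() {
  return process.memoryUsage().heapUsed / 1024 / 1024;
}

const samples = [];
for (let i = 0; i < 3; i++) {
  samples.push(Number(heapUsedMb().toFixed(1)));
}
console.log("heapUsed (MB):", samples);
```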