Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging 3rd party libraries. It collects links to all the places you might be looking at while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

@Indexed(unique=true) does not cause Dupe Key error

See original GitHub issue

When you mark a field as unique, inserting a new document with the same value in that field should throw a duplicate key error. However, I can currently insert two documents with duplicate values via Spring Boot.

<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.3.2.RELEASE</version>
</parent>
....
<dependency>
    <groupId>de.flapdoodle.embed</groupId>
    <artifactId>de.flapdoodle.embed.mongo</artifactId>
    <scope>test</scope>
</dependency>

Create a user class:

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.index.Indexed;
import org.springframework.data.mongodb.core.mapping.Document;

@Document(collection = "appUsers")
public class User {
    @Id
    private String id;

    @Indexed(unique = true)
    private String userName;

    @Indexed(unique = true)
    private String email;

    public User(String userName, String email) {
        this.userName = userName;
        this.email = email;
    }
}

In a test I can insert twice:

User user1 = new User("foobar","foo@bar.com"); // userName and email address in constructor
User user2 = new User("foobar","foo@bar.com"); 
userRepository.insert(user1);
userRepository.insert(user2); // expect an error here

userRepository is @Autowired and is a standard Mongo repository:

public interface UserRepository extends MongoRepository<User, String> { }
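For reference, the expected behavior can be sketched as a test. This is a hypothetical sketch, not from the issue: it assumes JUnit 5, Spring Boot's @DataMongoTest slice, and that the unique indexes from @Indexed(unique = true) have actually been created (which, as the comments below explain, requires enabling auto index creation):

```java
import static org.junit.jupiter.api.Assertions.assertThrows;

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.data.mongo.DataMongoTest;
import org.springframework.dao.DuplicateKeyException;

// Hypothetical test class illustrating the expected duplicate-key failure.
@DataMongoTest
class UserRepositoryTest {

    @Autowired
    private UserRepository userRepository;

    @Test
    void duplicateUserNameIsRejected() {
        userRepository.insert(new User("foobar", "foo@bar.com"));
        // With the unique index actually present, the second insert
        // should be rejected by MongoDB and surface as a Spring
        // DuplicateKeyException.
        assertThrows(DuplicateKeyException.class,
                () -> userRepository.insert(new User("foobar", "foo@bar.com")));
    }
}
```

Without the unique index in place, both inserts succeed and this test fails, which is exactly the symptom the issue describes.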

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Reactions: 2
  • Comments: 13

Top GitHub Comments

12 reactions
ClimberBear commented, Jan 10, 2021

I solved it by adding the option spring.data.mongodb.auto-index-creation: true to the application.yml configuration file.
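For context: since Spring Boot 2.3 (Spring Data MongoDB 3.x), automatic index creation is disabled by default, so the @Indexed(unique = true) annotations are silently ignored unless it is switched back on. A minimal sketch of the property in application.yml:

```yaml
spring:
  data:
    mongodb:
      auto-index-creation: true
```

The equivalent application.properties form is spring.data.mongodb.auto-index-creation=true.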

1 reaction
nogenem commented, Apr 2, 2022

Just an observation for other people: I just ran into this problem, and setting the auto-index-creation property to true didn't fix it. The reason is that I'm using a custom Mongo configuration where I extend AbstractMongoClientConfiguration, so I had to add this override to the config class:

@Override
public boolean autoIndexCreation() {
    return true;
}

Now the index is working ;p
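Put together, a minimal sketch of such a configuration class (the class name and database name are placeholders, not from the original comment):

```java
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoClientConfiguration;

// Hypothetical configuration class; "mydb" is a placeholder database name.
@Configuration
public class MongoConfig extends AbstractMongoClientConfiguration {

    @Override
    protected String getDatabaseName() {
        return "mydb";
    }

    // When a custom AbstractMongoClientConfiguration is registered, the
    // spring.data.mongodb.auto-index-creation property is not consulted,
    // so index creation has to be enabled here instead.
    @Override
    public boolean autoIndexCreation() {
        return true;
    }
}
```

This mirrors the override shown in the comment above; everything else in the class is boilerplate needed to make the sketch self-contained.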

Read more comments on GitHub >

Top Results From Across the Web

E11000 duplicate key error not throwing for unique index ...
1 Answer 1 ... It looks like the issue was caused by the npm module mongoose-delete which enables soft deleting. Soft deleting means...
Read more >
Add UNIQUE index fails with duplicate entry error, but no ...
Attempt 1: If I try to create the index after the data is in the table it fails with duplicate key error. A...
Read more >
Could not create unique index: how to solve duplication errors
Unfortunately, postgres cannot create a unique index as long as the duplicate items exist. First things first: how many duplicates do you have?...
Read more >
I ended up adding duplicate records on a unique index in ...
If there is a collection that has duplicate records, then you cannot add a unique index to the collection unless you delete those...
Read more >
Error: Cannot insert duplicate key row in... a non-unique index?!
Actually, the column values cited in the error don't line up the key columns in the index. I would say that the error...
Read more >
