4 Mapping Domain Classes
Version: 2023.3.0
Basic Mapping
GORM for MongoDB works by mapping each domain class to a Mongo collection. For example, given a domain class such as:
class Person {
    String firstName
    String lastName
    static hasMany = [pets: Pet]
}
This will map onto a MongoDB collection called "person".
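To make this concrete, here is a hedged sketch of what a saved instance looks like in that collection (the field values and generated id are illustrative; versioned entities also get a version field):

```groovy
// Hypothetical usage inside a running GORM for MongoDB application
new Person(firstName: 'Fred', lastName: 'Flintstone').save(flush: true)

// The "person" collection then holds a document shaped roughly like:
// { "_id": ObjectId("..."), "firstName": "Fred", "lastName": "Flintstone", "version": 0 }
```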
Embedded Documents
It is quite common in MongoDB to embed documents within documents (nested documents). This can be done with GORM embedded types:
class Person {
    String firstName
    String lastName
    Address address
    static embedded = ['address']
}
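The Address class itself is not defined in this section; a minimal sketch (the property names are purely illustrative) and the resulting nested document might look like:

```groovy
// Hypothetical embedded class (not shown in the original text)
class Address {
    String street
    String city
    String postCode
}

// A saved Person then embeds the address as a nested document, roughly:
// { "firstName": "Fred", "lastName": "Flintstone",
//   "address": { "street": "1 Main St", "city": "Bedrock", "postCode": "12345" } }
```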
You can map embedded lists and sets of documents/domain classes:
class Person {
    String firstName
    String lastName
    Address address
    List<Address> otherAddresses
    static embedded = ['address', 'otherAddresses']
}
You can also embed maps of embedded classes where the keys are strings:
class Person {
    String firstName
    String lastName
    Map<String, Address> addresses
    static embedded = ['addresses']
}
Basic Collection Types
You can also map lists and maps of basic types (such as strings) simply by defining the appropriate collection type:
class Person {
    List<String> friends
    Map pets
}
...
new Person(friends:['Fred', 'Bob'], pets:[chuck:"Dog", eddie:'Parrot']).save(flush:true)
Basic collection types are stored natively within the Mongo document: lists as BSON arrays and maps as nested BSON documents.
Customized Collection and Database Mapping
You may wish to customize how a domain class maps onto a MongoCollection. This is possible using the mapping block as follows:
class Person {
    ..
    static mapping = {
        collection "mycollection"
        database "mydb"
    }
}
In this example the Person entity has been mapped to a collection called "mycollection" in a database called "mydb".
You can also control how an individual property maps onto a Mongo Document field (the default is to use the property name itself):
class Person {
    ..
    static mapping = {
        firstName attr: "first_name"
    }
}
For non-embedded associations, GORM for MongoDB will by default map links between documents using MongoDB database references, also known as DBRefs. If you prefer not to use DBRefs, you can tell GORM to store a direct link instead by using the reference:false mapping:
class Person {
    ..
    static mapping = {
        address reference: false
    }
}
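As a rough sketch of the difference in the stored document (ids and field contents are illustrative):

```groovy
// The default DBRef mapping stores the association roughly as:
// { "address": { "$ref": "address", "$id": ObjectId("...") } }

// With reference:false, just the referenced document's id is stored:
// { "address": ObjectId("...") }
```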
4.1 Identity Generation
By default, GORM entities are supplied with an integer-based identifier. So, for example, the following entity:
class Person {}
has a property called id of type java.lang.Long. In this case GORM for MongoDB will generate a sequence-based identifier using the technique described in the MongoDB documentation on atomic operations.
However, sequence-based integer identifiers are not ideal for environments that require sharding (one of the nicer features of MongoDB). Hence it is generally advised to use either String-based ids:
class Person {
    String id
}
Or a native BSON ObjectId:
import org.bson.types.ObjectId

class Person {
    ObjectId id
}
BSON ObjectId instances are generated in a similar fashion to UUIDs.
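A hypothetical usage sketch (assumes a running GORM for MongoDB application): after a save, the id property is populated with a generated ObjectId:

```groovy
import org.bson.types.ObjectId

def person = new Person().save(flush: true)
assert person.id instanceof ObjectId
println person.id.toHexString() // a 24-character hex string
```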
Assigned Identifiers
Note that if you manually assign an identifier, then you will need to use the insert
method instead of the save
method, otherwise GORM can’t work out whether you are trying to achieve an insert or an update. Example:
class Person {
    String id
}
...
Person p = new Person()
p.id = "Fred"
// to insert
p.insert()
// to update
p.save()
4.2 Understanding Dirty Checking
To generate updates as efficiently as possible, GORM for MongoDB tracks the changes you make to persistent instances.
When an object is updated only the properties or associations that have changed will be updated.
You can check whether a given property has changed by using the hasChanged method:
if (person.hasChanged('firstName')) {
    // do something
}
This method is defined by the org.grails.datastore.mapping.dirty.checking.DirtyCheckable trait.
In the case of collections and association types, GORM for MongoDB wraps each collection in a dirty-checking-aware collection type.
One implication of this is that if you replace the collection with a non-dirty-checking-aware type, dirty checking can be disabled and the property may not be updated.
If any of your updates are not updating the properties you anticipate, you can force an update using the markDirty() method:
person.markDirty('firstName')
This will force GORM for MongoDB to issue an update for the given property name.
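Putting this together, a hypothetical flow might look like the following (listDirtyPropertyNames is also provided by the DirtyCheckable trait):

```groovy
// Hypothetical flow in a running GORM for MongoDB application
def person = Person.get(1)
person.firstName = 'Wilma'

assert person.hasChanged('firstName')
println person.listDirtyPropertyNames()

// If a change is not picked up (for example after replacing a wrapped
// collection), force an update for the property:
person.markDirty('lastName')
person.save(flush: true)
```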
Dirty Checking and Proxies
Dirty checking uses the equals() method to determine whether a property has changed. In the case of associations, it is important to recognize that if the association is a proxy, comparing properties on the domain that are not related to the identifier will initialize the proxy, causing another database query.
If the association does not define an equals() method, the default Groovy behavior of verifying that the instances are the same object will be used. Because proxies are never the same instance as an instance loaded from the database, this can cause confusing behavior. It is recommended to implement the equals() method if you need to check the dirtiness of an association. For example:
class Author {
    Long id
    String name

    /**
     * This ensures that if either or both of the instances
     * have a null id (new instances), they are not equal.
     */
    @Override
    boolean equals(o) {
        if (!(o instanceof Author)) return false
        if (this.is(o)) return true
        Author that = (Author) o
        if (id != null && that.id != null) return id == that.id
        return false
    }
}

class Book {
    Long id
    String title
    Author author
}
4.3 Customizing the WriteConcern
A feature of MongoDB is its ability to customize how important a database write is to the user. The Java client models this as a WriteConcern and there are various options that indicate whether the client cares about server or network errors, or whether the data has been successfully written or not.
If you wish to customize the WriteConcern for a domain class, you can do so in the mapping block:
import com.mongodb.WriteConcern

class Person {
    String name

    static mapping = {
        writeConcern WriteConcern.FSYNC_SAFE
    }
}
For versioned entities, if a WriteConcern lower than WriteConcern.ACKNOWLEDGED is specified, WriteConcern.ACKNOWLEDGED will still be used for updates, to ensure that optimistic locking failures are reported.
4.4 Dynamic Attributes
Unlike a relational database, MongoDB allows for "schemaless" persistence where there are no limits to the number of attributes a particular document can have. A GORM domain class on the other hand has a schema in that there are a fixed number of properties. For example consider the following domain class:
class Plant {
    boolean goesInPatch
    String name
}
Here there are two fixed properties, name and goesInPatch, that will be persisted into the MongoDB document. Using GORM for MongoDB you can, however, use dynamic properties via the Groovy subscript operator. For example:
def p = new Plant(name:"Pineapple")
p['color'] = 'Yellow'
p['hasLeaves'] = true
p.save()
p = Plant.findByName("Pineapple")
println p['color']
println p['hasLeaves']
Using the subscript operator you can add additional attributes to the underlying Document instance that gets persisted to MongoDB, allowing for more dynamic domain models.
4.5 Custom User Types
GORM for MongoDB will persist all commonly known Java types such as String, Integer and URL; however, if you want to persist one of your own classes that is not a domain class, you can implement a custom user type.
Custom Codecs
GORM for MongoDB is built on top of MongoDB’s BSON encoding framework. This means it is possible to implement custom Codecs for encoding and decoding values to and from BSON.
For example consider the following simple Groovy class:
class Birthday {
    Date date
}
By default the encoding engine does not know how to represent this type as a BSON value. To make the encoding engine understand this type you have to implement a custom codec:
import org.bson.*
import org.bson.codecs.*

class BirthdayCodec implements Codec<Birthday> {
    Birthday decode(BsonReader reader, DecoderContext decoderContext) {
        return new Birthday(date: new Date(reader.readDateTime())) // (1)
    }
    void encode(BsonWriter writer, Birthday value, EncoderContext encoderContext) {
        writer.writeDateTime(value.date.time) // (2)
    }
    Class<Birthday> getEncoderClass() { Birthday } // (3)
}
(1) Decodes the Birthday type from the BsonReader.
(2) Encodes the Birthday type to the BsonWriter.
(3) Returns the type that is to be encoded, in this case Birthday.
With that done, you then need to register the custom Codec. There are two ways to achieve this.
You can register a list of codecs in the grails.mongodb.codecs setting in application.yml:
grails:
  mongodb:
    codecs:
      - my.company.BirthdayCodec
Or you can create a META-INF/services/org.bson.codecs.Codec file containing the fully qualified class name of the Codec. If there are multiple codec classes you would like to register, put each one on a separate line.
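For example, reusing the hypothetical my.company.BirthdayCodec from the configuration above, the service file contains one fully qualified class name per line (placed on the classpath, e.g. under src/main/resources in a typical build):

```
# META-INF/services/org.bson.codecs.Codec
my.company.BirthdayCodec
```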
Custom Types with GORM
Another option is to define a GORM custom type. For example consider the following class:
class Birthday implements Comparable {
    Date date

    Birthday(Date date) {
        this.date = date
    }

    @Override
    int compareTo(Object t) {
        date.compareTo(t.date)
    }
}
Custom types should go in src/groovy, not app/domain.
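For context, the examples that follow assume a domain class referencing Birthday along these lines (hypothetical):

```groovy
// Hypothetical domain class using the custom type; this is what enables
// the Person.findByBirthday queries shown later in this section
class Person {
    Birthday birthday
}
```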
If you attempt to reference this class from a domain class, it will not automatically be persisted for you. However, you can create a custom type implementation and register it with Spring. For example:
import com.mongodb.BasicDBObject
import groovy.transform.InheritConstructors
import org.bson.Document
import org.grails.datastore.mapping.engine.types.AbstractMappingAwareCustomTypeMarshaller
import org.grails.datastore.mapping.model.PersistentProperty
import org.grails.datastore.mapping.mongo.query.MongoQuery
import org.grails.datastore.mapping.query.Query

@InheritConstructors
class BirthdayType extends AbstractMappingAwareCustomTypeMarshaller<Birthday, Document, Document> {
    @Override
    protected Object writeInternal(PersistentProperty property, String key, Birthday value, Document nativeTarget) {
        final converted = value.date.time
        nativeTarget.put(key, converted)
        return converted
    }

    @Override
    protected void queryInternal(PersistentProperty property, String key, Query.PropertyCriterion criterion, Document nativeQuery) {
        if (criterion instanceof Query.Between) {
            def dbo = new BasicDBObject()
            dbo.put(MongoQuery.MONGO_GTE_OPERATOR, criterion.getFrom().date.time)
            dbo.put(MongoQuery.MONGO_LTE_OPERATOR, criterion.getTo().date.time)
            nativeQuery.put(key, dbo)
        }
        else {
            nativeQuery.put(key, criterion.value.date.time)
        }
    }

    @Override
    protected Birthday readInternal(PersistentProperty property, String key, Document nativeSource) {
        final num = nativeSource.get(key)
        if (num instanceof Long) {
            return new Birthday(new Date(num))
        }
        return null
    }
}
The above BirthdayType class is a custom user type implementation for MongoDB for the Birthday class. It provides implementations for three methods: readInternal, writeInternal and the optional queryInternal. If you do not implement queryInternal, your custom type can be persisted but not queried.
The writeInternal method gets passed the property, the key to store it under, the value and the native Document where the custom type is to be stored:
@Override
protected Object writeInternal(PersistentProperty property, String key, Birthday value, Document nativeTarget) {
    final converted = value.date.time
    nativeTarget.put(key, converted)
    return converted
}
Within writeInternal you convert the custom type to a native value and store it in the Document. The readInternal method gets passed the PersistentProperty, the key the user type info is stored under (although you may want to use multiple keys) and the Document:
@Override
protected Birthday readInternal(PersistentProperty property, String key, Document nativeSource) {
    final num = nativeSource.get(key)
    if (num instanceof Long) {
        return new Birthday(new Date(num))
    }
    return null
}
You can then construct the custom type by reading values from the Document. Finally, the queryInternal method allows you to handle how a custom type is queried:
@Override
protected void queryInternal(PersistentProperty property, String key, Query.PropertyCriterion criterion, Document nativeQuery) {
    if (criterion instanceof Query.Between) {
        def dbo = new BasicDBObject()
        dbo.put(MongoQuery.MONGO_GTE_OPERATOR, criterion.getFrom().date.time)
        dbo.put(MongoQuery.MONGO_LTE_OPERATOR, criterion.getTo().date.time)
        nativeQuery.put(key, dbo)
    }
    else if (criterion instanceof Query.Equals) {
        nativeQuery.put(key, criterion.value.date.time)
    }
    else {
        throw new RuntimeException("unsupported query type for property $property")
    }
}
The method gets passed a criterion, which represents the type of query, and depending on that type you may handle the query differently. For example, the above implementation supports between and equals style queries, so the following two queries will work:
Person.findByBirthday(new Birthday(new Date()-7)) // find someone who was born 7 days ago
Person.findByBirthdayBetween(new Birthday(new Date()-7), new Birthday(new Date())) // find someone who was born in the last 7 days
However "like" or other query types will not work.
To register a custom type in a Grace application, simply register it as a Spring bean. For example, to register the above BirthdayType, add the following to grails-app/conf/spring/resources.groovy:
import com.example.*

// Place your Spring DSL code here
beans = {
    birthdayType(BirthdayType, Birthday)
}