Meteor Database Collections

meteor

http://stackoverflow.com/questions/39684374/meteor-package-that-automatically-remove-script-tags
https://github.com/aldeed/meteor-collection2 - done reading
https://github.com/aldeed/meteor-collection2#autovalue
https://github.com/aldeed/meteor-collection2#custom - done reading
https://github.com/aldeed/meteor-simple-schema - done reading
https://github.com/meteor/validated-method
https://github.com/aldeed/meteor-schema-index - done reading
https://github.com/aldeed/meteor-schema-deny - done reading
https://atmospherejs.com/davidyaha/collection2-migrations - done reading
https://docs.meteor.com/api/collections.html - done reading
http://guide.meteor.com/collections.html
https://github.com/matb33/meteor-collection-hooks - done reading
https://www.discovermeteor.com/blog/a-look-at-meteor-collection-hooks/
https://www.youtube.com/watch?v=WABQiAwqVJg

How can we validate against multiple schemas?

Normally, if you call attachSchema multiple times, the schemas are merged. If you use the replace: true option, the new schema replaces the previously attached one. However, in some cases you might actually want both schemas attached, with different documents validated against different schemas.

Products.attachSchema(SimpleProductSchema, {selector: {type: 'simple'}});
Products.attachSchema(VariantProductSchema, {selector: {type: 'variant'}});

Now both schemas are attached. When you insert a document with type: 'simple', it is validated against only SimpleProductSchema. When you insert a document with type: 'variant', it is validated against only VariantProductSchema. Alternatively, you can pass a selector option when inserting to choose which schema to use:

Products.insert({ title: 'This is a product' }, { selector: { type: 'simple' } });

For an update or upsert, the matching selector can be in the query, the modifier $set object, or the selector option.

What is the purpose of the transform option for the attachSchema statement?

If your validation requires that your doc be transformed using the collection's transform function prior to being validated, then you must pass the transform: true option to attachSchema when you attach the schema:

Books.attachSchema(Schemas.Book, {transform: true});

Read the documentation for Mongo.Collection and see what it offers.

What is the purpose of the replace option of the attachSchema method?

By default, if a collection already has a schema attached, attachSchema will combine the new schema with the existing. Pass the replace: true option to attachSchema to discard any existing schema.

How can we check a document against a schema?

check(doc, MyCollection.simpleSchema());

How can we get the list of invalid keys?

Books.insert({title: "Ulysses", author: "James Joyce"}, function(error, result) {
  // The insert will fail, error will be set,
  // and result will be undefined or false because "copies" is required.
  //
  // The list of errors is available on `error.invalidKeys` or by calling Books.simpleSchema().namedContext().validationErrors()
});

Books.update(book._id, {$unset: {copies: 1}}, function(error, result) {
  // The update will fail, error will be set,
  // and result will be undefined or false because "copies" is required.
  //
  // The list of errors is available on `error.invalidKeys` or by calling Books.simpleSchema().namedContext().validationErrors()
});

The callback you specify as the last argument of your insert() or update() call will have the first argument (error) set to an Error instance. The error message for the first invalid key is set in the error.message, and the full validationErrors array is available on error.invalidKeys. This is true on both client and server, even if validation for a client-initiated operation does not fail until checked on the server.

If you attempt a synchronous operation in server code, the same validation error is thrown since there is no callback to pass it to. If this happens in a server method (defined with Meteor.methods), a more generic Meteor.Error is passed back to your callback on the client. This error does not have an invalidKeys property, but it does have the error message for the first invalid key set in error.reason.

Generally speaking, you would probably not use the Error for displaying to the user. You can instead use the reactive methods provided by the SimpleSchema validation context to display the specific error messages to the user somewhere in the UI. The autoform package provides some UI components and helpers for this purpose.

How many validation contexts can we have?

Multiple. In the examples above, note that we called namedContext() with no arguments to access the SimpleSchema reactive validation methods. Contexts let you keep multiple separate lists of invalid keys for a single collection. In practice you might be able to get away with always using the default context, depending on what you're doing. If you're using the context's reactive methods to update UI elements, you might find the need for multiple contexts. For example, you might want one context for inserts and one for updates, or a different context for each form on a page.

How can we use a specific validation context?

To use a specific named validation context, use the validationContext option when calling insert or update:

Books.insert({title: "Ulysses", author: "James Joyce"}, { validationContext: "insertForm" }, function(error, result) {
  //The list of errors is available by calling Books.simpleSchema().namedContext("insertForm").validationErrors()
});

Books.update(book._id, {$unset: {copies: 1}}, { validationContext: "updateForm" }, function(error, result) {
  //The list of errors is available by calling Books.simpleSchema().namedContext("updateForm").validationErrors()
});

obj = {title: "Ulysses", author: "James Joyce"};
isValid = BookSchema.namedContext("myContext").validate(obj);
isValid = BookSchema.namedContext("myContext").validateOne(obj, "keyToValidate");
isValid = Match.test(obj, BookSchema);
check(obj, BookSchema);

// Validation errors are available through reactive methods
if (Meteor.isClient) {
  Meteor.startup(function() {
    Tracker.autorun(function() {
      var context = BookSchema.namedContext("myContext");
      if (!context.isValid()) {
        console.log(context.invalidKeys());
      }
    });
  });
}

How can we do validation without actually inserting or updating?

It's also possible to validate a document without performing the actual insert or update:

Books.simpleSchema().namedContext().validate({title: "Ulysses", author: "James Joyce"}, {modifier: false});

Set the modifier option to true if the document is a mongo modifier object.

How can we validate just one key in the document?

Books.simpleSchema().namedContext().validate({title: "Ulysses", author: "James Joyce"}, {modifier: false, keys: ['title']});

How can we insert or update without doing validation?

To skip validation, use the validate: false option when calling insert or update. On the client (untrusted code), this will skip only client-side validation. On the server (trusted code), it will skip all validation. The object is still cleaned and autoValues are still generated.

How can we skip removing properties that are not in the schema when inserting or updating?

To skip object property filtering, set the filter option to false when you call insert or update.

How can we skip conversion of values to match what schema expects when inserting or updating?

To skip automatic value conversion, set the autoConvert option to false when you call insert or update.

How can we skip removing empty strings when inserting or updating?

To skip removing empty strings, set the removeEmptyStrings option to false when you call insert or update.

How can we skip generating automatic values when inserting or updating?

To skip adding automatic values, set the getAutoValues option to false when you call insert or update. This works only in server code.

How can we bypass Collection2 entirely when inserting or updating?

Even if you skip all validation and cleaning, Collection2 will still do some object parsing that can take a long time for a large document. To bypass this, set the bypassCollection2 option to true when you call insert or update. This works only in server code.

What are the things that Collection2 does before every insert or update?

  1. Removes properties from your document or mongo modifier object if they are not explicitly listed in the schema. (To skip this, set the filter option to false when you call insert or update.)
  2. Automatically converts some properties to match what the schema expects, if possible. (To skip this, set the autoConvert option to false when you call insert or update.)
  3. Optimizes your operation so that empty string values will not be stored. (To skip this, set the removeEmptyStrings option to false when you call insert or update.)
  4. Adds automatic (forced or default) values based on your schema. Values are added only on the server and will make their way back to your client when your subscription is updated. (To skip this in server code, set the getAutoValues option to false when you call insert or update.)
  5. Validates your document or mongo modifier object. (To skip this, set the validate option to false when you call insert or update.)
  6. Performs the insert or update like normal, only if it was valid.

Collection2 is simply calling SimpleSchema methods to do these things. The validation happens on both the client and the server for client-initiated actions, giving you the speed of client-side validation along with the security of server-side validation.
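As an illustration only, steps 1 through 3 (filtering, auto-conversion, and empty-string removal) can be sketched as a plain function. The names here are made up; the real work is done by SimpleSchema's clean method.

```javascript
// Illustrative sketch of cleaning steps 1-3; not the real implementation.
function cleanSketch(doc, schemaKeys, expectedTypes) {
  var cleaned = {};
  Object.keys(doc).forEach(function (key) {
    if (schemaKeys.indexOf(key) === -1) return; // 1. filter keys not in the schema
    var value = doc[key];
    if (expectedTypes[key] === Number && typeof value === "string") {
      var n = Number(value);
      if (!isNaN(n)) value = n; // 2. autoConvert, e.g. "3" -> 3
    }
    if (value === "") return; // 3. removeEmptyStrings
    cleaned[key] = value;
  });
  return cleaned;
}

var doc = {title: "Ulysses", copies: "3", note: "", junk: true};
var cleaned = cleanSketch(doc, ["title", "copies", "note"], {copies: Number});
// cleaned is {title: "Ulysses", copies: 3}
```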

How can we enable debugging for Collection2?

You might find yourself in a situation where it seems as though validation is not working correctly. Set SimpleSchema.debug = true in your app before creating any named validation contexts. This logs additional information and causes all named validation contexts to automatically log all invalid key errors to the browser console, which can be helpful while developing an app to figure out why certain actions are failing validation.

How can we combine multiple schemas?

If you have schemas that share one or more subproperties, you can define them in a sub-schema to make your code cleaner and more concise. Here's an example:

AddressSchema = new SimpleSchema({
  street: {
    type: String,
    max: 100
  },
  city: {
    type: String,
    max: 50
  },
  state: {
    type: String,
    regEx: /^(?:A[LKSZRAEP]|C[AOT]|D[EC]|F[LM]|G[AU]|HI|I[ADLN]|K[SY]|LA|M[ADEHINOPST]|N[CDEHJMVY]|O[HKR]|P[ARW]|RI|S[CD]|T[NX]|UT|V[AIT]|W[AIVY])$/
  },
  zip: {
    type: String,
    regEx: /^[0-9]{5}$/
  }
});

CustomerSchema = new SimpleSchema({
  billingAddress: {
    type: AddressSchema
  },
  shippingAddresses: {
    type: [AddressSchema],
    minCount: 1
  }
});

Alternatively, if you want to reuse mini-schemas in multiple places but you don't want a subdocument like you get with sub-schemas, you can pass multiple schemas to the SimpleSchema constructor, and they will be combined.

cmsBaseSchema = new SimpleSchema({ ... });
cmsPageSchema = new SimpleSchema([cmsBaseSchema, {additionalField: {type: String} }]);

How can we extract a certain part of a schema?

Sometimes you have one large SimpleSchema object, and you need just a subset of it for some purpose. To pull out certain schema keys into a new schema, you can use the pick method:

var profileSchema = new SimpleSchema({
  firstName: {type: String},
  lastName: {type: String},
  username: {type: String}
});

var nameSchema = profileSchema.pick(['firstName', 'lastName']);

When using pick on a field of type Array you also need to pick the array item field. Take the following as an example:

var profileSchema = new SimpleSchema({
  firstName: {
    type: String
  },
  lastName: {
    type: String
  },
  comments: {
    type: [String]
  }
});

var nameSchema = profileSchema.pick('comments', 'comments.$');

Can we use the dot notation inside the SimpleSchema definition?

Yes. A basic schema key is just the name of the key (property) to expect in the objects that will be validated. If necessary, though, you can use string keys with mongo-style dot notation to validate nested arrays and objects.

MySchema = new SimpleSchema({
    "mailingAddress.street": {
        type: String
    },
    "mailingAddress.city": {
        type: String
    }
});

How can we indicate the presence of an array?

To indicate the presence of an array, use a $:

MySchema = new SimpleSchema({
    "addresses.$.street": {
        type: String
    },
    "addresses.$.city": {
        type: String
    }
});

How can we explicitly define a complex field?

In the examples above, we did not explicitly define the mailingAddress object or the addresses array or the addresses.$ object. This is fine because they will be implicitly defined for you. However, note that implicit objects and arrays of objects are assumed to be optional. This means their required properties will only be required if the object itself is present in the document or modifier being validated. So in general, it's clearer if you explicitly define objects and arrays of objects in your schema. Here's an example of explicitly defining an array of objects such that it will be required and have a minimum and maximum array count:

MySchema = new SimpleSchema({
    addresses: {
        type: [Object],
        minCount: 1,
        maxCount: 4
    },
    "addresses.$.street": {
        type: String
    },
    "addresses.$.city": {
        type: String
    }
});

What are the available types?

String, Number, Boolean, Object, Date, or any custom class (constructor function).

How can we indicate that a field is an array inside our schema definition?

[String], [Number], [Boolean], [Object], [Date]

How can we do i18n with SimpleSchema?

label: A string that will be used to refer to this field in validation error messages. The default is an inflected (humanized) derivation of the key name itself. For example, the key "firstName" will have a default label of "First name". If a field's label needs to change depending on circumstances, you can provide a callback function as the label.

MySchema = new SimpleSchema({
  firstName: {
    type: String,
    label: function () {
      return Session.get("lang") == "de"
            ? "Vorname" : "first name";
    }
  }
});

Alternatively, you can use the labels method to alter one or more labels on the fly:

MySchema.labels({
    password: "Enter your password"
});

This method causes reactive labels to update. To get the label for a field, use MySchema.label(fieldName), which returns a usable string. This method is reactive.
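As a rough illustration of the default-label inflection mentioned above, here is a minimal sketch. SimpleSchema's actual inflection rules may differ in edge cases; the function name is made up.

```javascript
// Illustrative only: roughly how "firstName" becomes "First name".
function humanize(key) {
  var words = key.replace(/([a-z])([A-Z])/g, "$1 $2").toLowerCase();
  return words.charAt(0).toUpperCase() + words.slice(1);
}

humanize("firstName"); // "First name"
humanize("zip");       // "Zip"
```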

How is the 'optional' attribute interpreted for complex objects?

By default, all keys are required. Set optional: true to change that. With complex keys, it might be difficult to understand what "required" means. Here's a brief explanation of how requiredness is interpreted:

  1. If type is Array or is an array (any type surrounded by array brackets), then "required" means that key must have a value, but an empty array is fine. (If an empty array is not fine, add the minCount: 1 option.)
  2. For items within an array, or when the key name ends with ".$", the optional option has no effect. That is, something cannot be "required" to be in an array.
  3. If a key is required at a deeper level, the key must have a value only if the object it belongs to is present.
  4. When the object being validated is a Mongo modifier object, changes that would unset or null a required key result in validation errors.

That last point can be confusing, so let's look at a couple examples:

  1. Say you have a required key "friends.address.city" but "friends.address" is optional. If "friends.address" is set in the object you're validating, but "friends.address.city" is not, there is a validation error. However, if "friends.address" is not set, then there is no validation error for "friends.address.city" because the object it belongs to is not present.
  2. If you have a required key "friends.$.name", but the friends array has no objects in the object you are validating, there is no validation error for "friends.$.name". When the friends array does have objects, every present object is validated, and each object could potentially have a validation error if it is missing the name property. For example, when there are two objects in the friends array and both are missing the name property, there will be a validation error for both "friends.0.name" and "friends.1.name".

How should we interpret the min/max attribute?

  1. If type is Number or [Number], these rules define the minimum or maximum numeric value.
  2. If type is String or [String], these rules define the minimum or maximum string length.
  3. If type is Date or [Date], these rules define the minimum or maximum date, inclusive.

You can alternatively provide a function that takes no arguments and returns the appropriate minimum or maximum value. This is useful, for example, if the minimum Date for a field should be "today".
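A minimal sketch of how a min rule might be interpreted per type, per the rules above. This is illustrative only; the function name violatesMin is made up and not part of SimpleSchema.

```javascript
// Illustrative only: min means numeric value for Numbers, length for
// Strings, and an inclusive lower bound for Dates. A function-valued
// min is called to obtain the actual bound.
function violatesMin(value, min) {
  if (typeof min === "function") min = min(); // e.g. min: function () { return new Date(); }
  if (typeof value === "number") return value < min;
  if (typeof value === "string") return value.length < min;
  if (value instanceof Date) return value.getTime() < min.getTime();
  return false;
}

violatesMin(3, 5);        // true: 3 < 5
violatesMin("abcdef", 5); // false: length 6 >= 5
violatesMin(new Date(0), function () { return new Date(1000); }); // true
```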

How should we interpret the exclusiveMin/exclusiveMax attribute?

Set to true to indicate that the range of numeric values set by min/max is to be treated as an exclusive range. Set to false (the default) to treat ranges as inclusive.

How should we interpret the decimal attribute?

Set to true if type is Number or [Number] and you want to allow non-integers. The default is false.

How should we interpret the minCount/maxCount attribute?

Define the minimum or maximum array length. Used only when type is an array or is Array.

How should we interpret the allowedValues attribute?

An array of values that are allowed. A key will be invalid if its value is not one of these.

How should we interpret the 'blackbox' attribute?

If you have a key with type Object, the properties of the object will be validated as well, so you must define all allowed properties in the schema. If this is not possible or you don't care to validate the object's properties, use the blackbox: true option to skip validation for everything within the object.

Custom object types are treated as blackbox objects by default. However, when using collection2, you must ensure that the custom type is not lost between client and server. This can be done with a transform function that converts the generic Object to the custom object. Without this transformation, client-side inserts and updates might succeed on the client but then fail on the server. Alternatively, if you don't care about losing the custom type, you can explicitly set blackbox: true for a custom object type instead of using a transformation.

How should we interpret the 'trim' attribute?

Set to false if the string value for this key should not be trimmed (i.e., leading and trailing spaces should be kept). Otherwise, all strings are trimmed when you call mySimpleSchema.clean().

How should we interpret the 'custom' attribute?

Refer to the Custom Validation section.

How should we interpret the defaultValue attribute?

Set this to any value that you want to be used as the default when an object does not include this field or has this field set to undefined. This value will be injected into the object by a call to mySimpleSchema.clean(). Default values are set only when cleaning non-modifier objects. Note the following points of confusion:

  1. A default value itself is not cleaned. So, for example, if your default value is "", it will not be removed by the removeEmptyStrings operation in the cleaning.
  2. A default value is always added if there isn't a value set. Even if the property is a child of an optional object, and the optional object is not present, the object will be added and its property will be set to the default value. Effectively, this means that if you provide a default value for one property of an object, you must provide a default value for all properties of that object or risk confusing validation errors.

If you need more control, use the autoValue option instead.
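A minimal sketch of these defaultValue semantics (applyDefaults is a made-up name, not part of SimpleSchema):

```javascript
// Illustrative only: a default is injected when the field is missing or
// undefined, only for non-modifier objects, and the default itself is
// not cleaned afterwards (an "" default survives removeEmptyStrings).
function applyDefaults(doc, defaults, isModifier) {
  if (isModifier) return doc; // defaults apply only to non-modifier objects
  Object.keys(defaults).forEach(function (key) {
    if (doc[key] === undefined) doc[key] = defaults[key];
  });
  return doc;
}

applyDefaults({title: "Ulysses"}, {copies: 1, note: ""}, false);
// -> {title: "Ulysses", copies: 1, note: ""}
```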

How should we interpret the 'autoValue' attribute?

The autoValue option allows you to specify a function that is called by mySimpleSchema.clean() to potentially change the value of a property in the object being cleaned. This is a powerful feature that allows you to set up either forced values or default values, potentially based on the values of other fields in the object.

An autoValue function is passed the document or modifier as its only argument, but you will generally not need it. Instead, the function context provides a variety of properties and methods to help you determine what you should return.

If an autoValue function does not return anything (i.e., returns undefined), the field's value will be whatever the document or modifier says it should be. If that field is already in the document or modifier, it stays in the document or modifier with the same value. If it's not in the document or modifier, it's still not there. If you don't want it to be in the doc or modifier, you must call this.unset().

Any other return value will be used as the field's value. You may also return special pseudo-modifier objects for update operations. Examples are {$inc: 1} and {$push: new Date}.

The following properties and methods are available in this for an autoValue function:

  1. isSet: True if the field is already set in the document or modifier
  2. unset(): Call this method to prevent the original value from being used when you return undefined.
  3. value: If isSet = true, this contains the field's current (requested) value in the document or modifier.
  4. operator: If isSet = true and isUpdate = true, this contains the name of the update operator in the modifier in which this field is being changed. For example, if the modifier were {$set: {name: "Alice"}}, in the autoValue function for the name field, this.isSet would be true, this.value would be "Alice", and this.operator would be "$set".
  5. field(): Use this method to get information about other fields. Pass a field name (schema key) as the only argument. The return object will have isSet, value, and operator properties for that field.
  6. siblingField(): Use this method to get information about other fields that have the same parent object. Works the same way as field(). This is helpful when you use sub-schemas or when you're dealing with arrays of objects.
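A sketch of an autoValue function using this.field() and this.unset(), exercised with a hand-built mock of the context. The slug example and the mock are made up; in a real schema, SimpleSchema supplies the context when clean() runs.

```javascript
// Illustrative only: an autoValue that derives a slug from the title
// field. The mock context stands in for the one SimpleSchema provides.
var slugAutoValue = function () {
  var title = this.field("title");
  if (title.isSet) {
    return title.value.toLowerCase().replace(/\s+/g, "-");
  }
  this.unset(); // no title: drop any slug the caller supplied
};

var unsetCalled = false;
var mockContext = {
  isSet: false,
  field: function (name) {
    return {isSet: true, value: "Hello World", operator: null};
  },
  unset: function () { unsetCalled = true; }
};

var slug = slugAutoValue.call(mockContext); // "hello-world"
```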

What is the purpose and signature of the clean method provided by the SimpleSchema package?

SimpleSchema instances provide a clean method that cleans or alters data in a number of ways. It's intended to be called prior to validation to avoid any avoidable validation errors. The clean method takes the object to be cleaned as its first argument and the following optional options as its second argument:

  1. filter: Filter out properties not found in the schema? True by default. This removes any keys not explicitly or implicitly allowed by the schema, which prevents errors being thrown for those keys during validation.
  2. autoConvert: Type convert properties into the correct type where possible? True by default. This helps eliminate unnecessary validation messages by automatically converting values where possible. For example, non-string values can be converted to a String if the schema expects a String, and strings that are numbers can be converted to Numbers if the schema expects a Number.
  3. removeEmptyStrings: Remove keys in normal object or $set where the value is an empty string? True by default.
  4. trimStrings: Remove all leading and trailing spaces from string values? True by default.
  5. getAutoValues: Run autoValue functions and inject automatic and defaultValue values? True by default.
  6. isModifier: Is the first argument a modifier object? False by default.
  7. extendAutoValueContext: This object will be added to the this context of autoValue functions. extendAutoValueContext can be used to give your autoValue functions additional valuable information, such as userId. (Note that operations done using the Collection2 package automatically add userId to the autoValue context already.)

The Collection2 package always calls clean before every insert, update, or upsert:

mySchema.clean(obj); // obj is now potentially changed

What is the purpose of 'named validation context'?

Before you can validate an object against your schema, you need to get a new validation context from the SimpleSchema. A validation context provides reactive methods for validating and checking the validation status of a particular object. It's usually best to use a named validation context. That way, the context is automatically persisted by name, allowing you to easily rely on its reactive methods. To obtain a named validation context, call namedContext(name):

// create a simple schema object
var ss = new SimpleSchema({
    requiredString: {
        type: String
    }
});

// create a named validation context
var ssContext1 = ss.namedContext("userForm");

The first time you request a context with a certain name, it is created. Calling namedContext() is equivalent to calling namedContext("default").

What is the purpose of 'unnamed validation context'?

An unnamed validation context is not persisted anywhere. It can be useful when you need to see if a document is valid but you don't need any of the reactive methods for that context. To obtain an unnamed validation context, call newContext():

var ss = new SimpleSchema({
    requiredString: {
        type: String
    }
});
var ssContext1 = ss.newContext();

How can we validate an object against a schema?

To validate an object against the schema in a validation context, call myContext.validate(obj, options). This method returns true if the object is valid according to the schema or false if it is not. It also stores a list of invalid fields and corresponding error messages in the context object and causes the reactive methods to react. Now you can call myContext.isValid() to see if the object passed into validate() was found to be valid. This is a reactive method that returns true or false.

A schema can be passed as the second argument to Meteor's check() and Match.test() methods from the Check package. check() will throw a Match.Error if the object specified in the first argument is not valid according to the schema.

var mySchema = new SimpleSchema({name: {type: String}});

Match.test({name: "Steve"}, mySchema); // Returns true
Match.test({admin: true}, mySchema); // Returns false
check({admin: true}, mySchema); // Throws a Match.Error

How can we validate only one key in a document?

You may have the need to validate just one key. For this, use myContext.validateOne(obj, key, options). This works the same way as the validate method, except that only the specified schema key will be validated. This may cause all of the reactive methods to react. This method returns true if the specified schema key is valid according to the schema or false if it is not.

What are the options supported by the validate() method and the validateOne() method?

Both validate() and validateOne() accept the following options:

  1. modifier: Are you validating a Mongo modifier object? False by default.
  2. upsert: Are you validating a Mongo modifier object potentially containing upsert operators? False by default.
  3. extendedCustomContext: This object will be added to the this context in any custom validation functions that are run during validation. See the Custom Validation section.

How can we handle validation errors using SimpleSchema?

Call mySimpleSchema.validate(doc) to validate doc against the schema and throw a ValidationError if invalid. This is like check(doc, mySimpleSchema) but without the check dependency and with the ability to pass full schema error details back to a callback on the client.

Call mySimpleSchema.validator() to get a function that calls mySimpleSchema.validate for whatever object is passed to it. This means you can do validate: mySimpleSchema.validator() in the mdg:validated-method package. If you set the clean option to true, then the object will be cleaned before it is validated. If you want to change any of the default cleaning options, you can pass in those, too.

How can we add a custom validation function that is called for all keys in all defined schemas?

To add a custom validation function that is called for all keys in all defined schemas, use SimpleSchema.addValidator(myFunction).

How can we add a custom validation function that is called for all keys for a specific SimpleSchema instance?

To add a custom validation function that is called for all keys for a specific SimpleSchema instance, use mySimpleSchema.addValidator(myFunction).

How can we add a custom validation function that is called for a specific key in a specific schema?

To add a custom validation function that is called for a specific key in a specific schema, use the custom option in the schema definition for that key.

How should we implement our custom validation function using SimpleSchema?

All custom validation functions work the same way and have the same this context:

  1. Do any necessary custom validation, and return a String describing the error type if you determine that the value is invalid. Any non-string return value means the value is valid.
  2. The error type string can be one of the built-in strings or any string you want. If you return a custom string, you'll usually want to define a message for it.
  3. Within the function, this provides the following properties:
    1. key: The name of the schema key (e.g., "addresses.0.street")
    2. genericKey: The generic name of the schema key (e.g., "addresses.$.street")
    3. definition: The schema definition object.
    4. isSet: Does the object being validated have this key set?
    5. value: The value to validate.
    6. operator: The Mongo operator for which we're doing validation. Might be null.
    7. field(): Use this method to get information about other fields. Pass a field name (non-generic schema key) as the only argument. The return object will have isSet, value, and operator properties for that field.
    8. siblingField(): Use this method to get information about other fields that have the same parent object. Works the same way as field(). This is helpful when you use sub-schemas or when you're dealing with arrays of objects.

If you need to do some custom validation on the server and then display errors back on the client, refer to the Asynchronous Custom Validation on the Client section.
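A sketch of a custom function that uses this.isSet, this.value, and this.field() as described above, exercised with a hand-built mock of the context. The date fields, error type string, and mock are made up for illustration.

```javascript
// Illustrative only: a `custom` function requiring that endDate not come
// before startDate. The mock stands in for SimpleSchema's context.
var endAfterStart = function () {
  var start = this.field("startDate");
  if (this.isSet && start.isSet && this.value < start.value) {
    return "endBeforeStart"; // custom error type; define a message for it
  }
  // returning undefined (a non-string) means the value is valid
};

var mock = {
  isSet: true,
  value: new Date("2020-01-01"),
  field: function () {
    return {isSet: true, value: new Date("2020-06-01")};
  }
};

var result = endAfterStart.call(mock); // "endBeforeStart"
```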

How can we manually add a validation error?

If you want to reactively display an arbitrary validation error and it is not possible to use a custom validation function (perhaps you have to call a function onSubmit or wait for asynchronous results), you can add one or more errors to a validation context at any time by calling myContext.addInvalidKeys(errors), where errors is an array of error objects with the following format:

{name: key, type: errorType, value: anyValue}
  1. name: The schema key as specified in the schema.
  2. type: The type of error. Any string you want, or one of the following built-in strings:
    1. required
    2. minString
    3. maxString
    4. minNumber
    5. maxNumber
    6. minDate
    7. maxDate
    8. badDate
    9. minCount
    10. maxCount
    11. noDecimal
    12. notAllowed
    13. expectedString
    14. expectedNumber
    15. expectedBoolean
    16. expectedArray
    17. expectedObject
    18. expectedConstructor
    19. regEx
  3. value: Optional. The value that was not valid. Will be used to replace the [value] placeholder in error messages.

If you use a custom string for type, be sure to define a message for it. (See Customizing Validation Messages).

SimpleSchema.messages({wrongPassword: "Wrong password"});

myContext.addInvalidKeys([{name: "password", type: "wrongPassword"}]);

How can we implement Asynchronous Custom Validation on the Client?

Validation runs synchronously for many reasons, and likely always will. This makes it difficult to wait for asynchronous results as part of custom validation. Here's one example of how you might validate that a username is unique on the client, without publishing all usernames to every client:

username: {
  type: String,
  regEx: /^[a-z0-9A-Z_]{3,15}$/,
  unique: true,
  custom: function () {
    if (Meteor.isClient && this.isSet) {
      Meteor.call("accountsIsUsernameAvailable", this.value, function (error, result) {
        if (!result) {
          Meteor.users.simpleSchema().namedContext("createUserForm").addInvalidKeys([{name: "username", type: "notUnique"}]);
        }
      });
    }
  }
}

Note that we're calling our "accountsIsUsernameAvailable" server method and waiting for an asynchronous result, which is a boolean that indicates whether that username is available. If it's taken, we manually invalidate the username key with a "notUnique" error.

This doesn't change the fact that validation is synchronous. If you use this with an autoform and there are no validation errors, the form would still be submitted. However, the user creation would fail and a second or two later, the form would display the "notUnique" error, so the end result is very similar to actual asynchronous validation. You can use a technique similar to this to work around asynchronicity issues in both client and server code.

How can we get a List of Invalid Keys and Validation Error Messages?

Call myContext.invalidKeys() to get the full array of invalid key data. Each object in the array has two keys:

  1. name: The schema key as specified in the schema.
  2. type: The type of error. One of the required*, min*, max* etc. strings listed at Manually Adding a Validation Error.

This is a reactive method. There is no message property. Once you see what keys are invalid, you can call ctxt.keyErrorMessage(key) to get a reactive message string. If you want to add a message property to the invalidKeys array objects (which would no longer be reactive), you can do:

var ik = ctxt.invalidKeys();
ik = _.map(ik, function (o) {
  return _.extend({message: ctxt.keyErrorMessage(o.name)}, o);
});

What is the purpose of the keyIsInvalid method?

myContext.keyIsInvalid(key) returns true if the specified key is currently invalid, or false if it is valid. This is a reactive method.

What is the purpose of the keyErrorMessage method?

myContext.keyErrorMessage(key) returns the error message for the specified key if it is invalid. If it is valid, this method returns an empty string. This is a reactive method.
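These two methods are typically used together from reactive template helpers, for example to show an inline error next to a form field. A minimal sketch (the template and field names are illustrative):

```javascript
Template.signupForm.helpers({
  usernameIsInvalid: function () {
    return myContext.keyIsInvalid('username'); // reactive boolean
  },
  usernameError: function () {
    return myContext.keyErrorMessage('username'); // "" when valid
  }
});
```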

What is the purpose of the resetValidation method?

Call myContext.resetValidation() if you need to reset the validation context, clearing out any invalid field messages and making it valid.

What is the purpose of the schema method?

Call MySchema.schema([key]) to get the schema definition object. If you specify a key, then only the schema definition for that key is returned. Note that this may not match exactly what you passed into the SimpleSchema constructor. The schema definition object is normalized internally, and this method returns the normalized copy.
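For example, assuming a simple schema like the one below, the single-key form returns just that key's normalized definition:

```javascript
var BookSchema = new SimpleSchema({
  title: {type: String, max: 200}
});

var fullDef = BookSchema.schema();         // the whole normalized definition object
var titleDef = BookSchema.schema('title'); // only the definition for "title"
// titleDef includes type and max, plus any defaults added during normalization
```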

How can we customize validation messages?

To customize validation messages, pass a messages object to either SimpleSchema.messages() or mySimpleSchemaInstance.messages(). Instance-specific messages are given priority over global messages. The format of the messages object is:

{
  errorType: message
}

You can also specify override messages for specific fields:

{
  "errorType schemaKey": message
}

For the regEx error type, you must specify a special message array of objects:

{
  "regEx": [
    {msg: "Default Message"},
    {exp: SimpleSchema.RegEx.Url, msg: "You call that a URL?"}
  ],
  "regEx schemaKey": [
    {exp: SimpleSchema.RegEx.Url, msg: "It's very important that you enter a valid URL here"}
  ]
}

The message is a string. It can contain a number of different placeholders between square brackets:

  1. [label] will be replaced with the field label
  2. [min] will be replaced with the minimum allowed value (string length, number, or date)
  3. [max] will be replaced with the maximum allowed value (string length, number, or date)
  4. [minCount] will be replaced with the minimum array count
  5. [maxCount] will be replaced with the maximum array count
  6. [value] will be replaced with the value that was provided to save but was invalid (not available for all error types)
  7. [type] will be replaced with the expected type; useful for the expectedConstructor error type

By way of example, here is what it would look like if you defined the default error messages yourself:

SimpleSchema.messages({
  required: "[label] is required",
  minString: "[label] must be at least [min] characters",
  maxString: "[label] cannot exceed [max] characters",
  minNumber: "[label] must be at least [min]",
  maxNumber: "[label] cannot exceed [max]",
  minDate: "[label] must be on or after [min]",
  maxDate: "[label] cannot be after [max]",
  badDate: "[label] is not a valid date",
  minCount: "You must specify at least [minCount] values",
  maxCount: "You cannot specify more than [maxCount] values",
  noDecimal: "[label] must be an integer",
  notAllowed: "[value] is not an allowed value",
  expectedString: "[label] must be a string",
  expectedNumber: "[label] must be a number",
  expectedBoolean: "[label] must be a boolean",
  expectedArray: "[label] must be an array",
  expectedObject: "[label] must be an object",
  expectedConstructor: "[label] must be a [type]",
  regEx: [
    {msg: "[label] failed regular expression validation"},
    {exp: SimpleSchema.RegEx.Email, msg: "[label] must be a valid e-mail address"},
    {exp: SimpleSchema.RegEx.WeakEmail, msg: "[label] must be a valid e-mail address"},
    {exp: SimpleSchema.RegEx.Domain, msg: "[label] must be a valid domain"},
    {exp: SimpleSchema.RegEx.WeakDomain, msg: "[label] must be a valid domain"},
    {exp: SimpleSchema.RegEx.IP, msg: "[label] must be a valid IPv4 or IPv6 address"},
    {exp: SimpleSchema.RegEx.IPv4, msg: "[label] must be a valid IPv4 address"},
    {exp: SimpleSchema.RegEx.IPv6, msg: "[label] must be a valid IPv6 address"},
    {exp: SimpleSchema.RegEx.Url, msg: "[label] must be a valid URL"},
    {exp: SimpleSchema.RegEx.Id, msg: "[label] must be a valid alphanumeric ID"}
  ],
  keyNotInSchema: "[key] is not allowed by the schema"
});

You should call this method on both the client and the server to make sure that your messages are consistent. You can call this method multiple times, for example to change languages on the fly, and the messages on screen will reactively change. If your message contains a [label] placeholder, the label name reactively updates when changed, too.

Why should we validate and store Dates set to the UTC time zone?

For consistency, you should generally validate and store Dates set to the UTC time zone. If you care only about the date, then use a Date object set to the desired date at midnight UTC. If you need the time, too, then use a Date object set to the desired date and time UTC.

This goes for min and max dates, too. If you care only about the date portion and you want to specify a minimum date, min should be set to midnight UTC on the minimum date (inclusive). Following these rules ensures maximum interoperability with HTML5 date inputs and usually just makes sense.
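Because the plain Date constructor interprets its arguments in local time, Date.UTC is the reliable way to build a midnight-UTC Date regardless of the server's or client's time zone:

```javascript
// A date-only value for 26 September 2016, at midnight UTC
// (note that months are 0-indexed):
var pubDate = new Date(Date.UTC(2016, 8, 26));

console.log(pubDate.toISOString()); // "2016-09-26T00:00:00.000Z"

// The same construction works for min/max options, e.g.:
// min: new Date(Date.UTC(2016, 0, 1))
```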

How can we make a field conditionally required?

If you have a field that should be required only in certain circumstances, first make the field optional, and then use a custom function similar to this:

{
  field: {
    type: String,
    optional: true,
    custom: function () {
var shouldBeRequired = this.field('saleType').value === 1;

      if (shouldBeRequired) {
        // inserts
        if (!this.operator) {
          if (!this.isSet || this.value === null || this.value === "") return "required";
        }

        // updates
        else if (this.isSet) {
if (this.operator === "$set" && (this.value === null || this.value === "")) return "required";
          if (this.operator === "$unset") return "required";
          if (this.operator === "$rename") return "required";
        }
      }
    }
  }
}

Here, shouldBeRequired encodes whatever condition should trigger the field being required (in this example, saleType being 1). In the future we could make this a bit simpler by allowing optional to be a function that returns true or false. Pull requests welcome.

How can we validate one key against another key?

Here's an example of declaring one value valid or invalid based on another value using a custom validation function.

SimpleSchema.messages({
  "passwordMismatch": "Passwords do not match"
});

MySchema = new SimpleSchema({
  password: {
    type: String,
    label: "Enter a password",
    min: 8
  },
  confirmPassword: {
    type: String,
    label: "Enter the password again",
    min: 8,
    custom: function () {
      if (this.value !== this.field('password').value) {
        return "passwordMismatch";
      }
    }
  }
});

How can we extend SimpleSchema options?

You may find at some point that there is something extra you would really like to define within a schema for your package or app. However, if you add unrecognized options to your schema definition, you will get an error. To inform SimpleSchema about your custom option and avoid the error, you need to call SimpleSchema.extendOptions. By way of example, here is how the Collection2 package adds the additional schema options it provides:

SimpleSchema.extendOptions({
  index: Match.Optional(Match.OneOf(Number, String, Boolean)),
  unique: Match.Optional(Boolean),
  denyInsert: Match.Optional(Boolean),
  denyUpdate: Match.Optional(Boolean)
});

How can we implement custom input type?

This capability comes from the autoform package; check its documentation for details. The package's author (Eric Dobbertin) has created a number of custom input types that ship with AutoForm.

What is the purpose of the aldeed:schema-index package?

It is a Meteor package that allows you to control some MongoDB indexing from your SimpleSchema. This package is currently included automatically with the aldeed:collection2 package.

Use the index option to ensure a MongoDB index for a specific field:

{
  title: {
    type: String,
    index: 1
  }
}

Set to 1 or true for an ascending index. Set to -1 for a descending index. Or you may set this to another type of specific MongoDB index, such as "2d". Indexes work on embedded sub-documents as well.

If you have created an index for a field by mistake and you want to remove or change it, set index to false:

{
  "address.street": {
    type: String,
    index: false
  }
}

If you need to change anything about an index, you must first start the app with index: false to drop the old index, and then restart with the correct index properties.

If a field has the unique option set to true, the MongoDB index will be a unique index as well. Then on the server, Collection2 will rely on MongoDB to check uniqueness of your field, which is more efficient than our custom checking.

{
  "pseudo": {
    type: String,
    index: true,
    unique: true
  }
}

For the unique option to work, index must be true, 1, or -1. The error message for uniqueness is very generic. It's best to define your own using MyCollection.simpleSchema().messages(). The error type string is "notUnique".
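A sketch of defining that message, both globally for the schema and for a specific key (the collection and key names are illustrative):

```javascript
Books.simpleSchema().messages({
  notUnique: "[label] must be unique",
  "notUnique isbn": "This ISBN is already registered"
});
```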

You can use the sparse option along with the index and unique options to tell MongoDB to build a sparse index. By default, MongoDB will only permit one document that lacks the indexed field. By setting the sparse option to true, the index will only contain entries for documents that have the indexed field. The index skips over any document that is missing the field. This is helpful when indexing on a key in an array of sub-documents. Learn more in the MongoDB docs.
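A sketch of a sparse unique index on an optional field (the field name is illustrative):

```javascript
{
  nickname: {
    type: String,
    optional: true,
    index: true,
    unique: true,
    sparse: true // documents without `nickname` are left out of the index,
                 // so any number of them may omit the field
  }
}
```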

All indexes are built in the background so indexing does not block other database queries.

What is the purpose of the aldeed:schema-deny package?

It is a Meteor package that allows you to deny inserting or updating certain properties in your database by setting options in your schema. This package is currently included automatically with the aldeed:collection2 package.

If you set denyUpdate: true, any collection update that modifies the field will fail. For instance:

const postSchema = new SimpleSchema({
  title: {
    type: String
  },
  content: {
    type: String
  },
  createdAt: {
    type: Date,
    denyUpdate: true
  }
});

const Posts = new Mongo.Collection('posts');
Posts.attachSchema(postSchema);

const postId = Posts.insert({title: 'Hello', content: 'World', createdAt: new Date()});

The denyInsert option works the same way, but for inserts. If you set denyInsert to true, you will need to set optional: true as well.
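For example, a hypothetical updatedAt field that must not be supplied at insert time could be declared like this (note the required optional: true):

```javascript
const postAuditSchema = new SimpleSchema({
  updatedAt: {
    type: Date,
    denyInsert: true, // any insert supplying this field fails validation
    optional: true    // required whenever denyInsert is true
  }
});
```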

What are the things that Collection2 implemented for the 'custom' option of SimpleSchema?

Collection2 adds the following properties to this for any custom function that is called as part of a C2 database operation:

  1. isInsert: True if it's an insert operation
  2. isUpdate: True if it's an update operation
  3. isUpsert: True if it's an upsert operation (either upsert() or upsert: true)
  4. userId: The ID of the currently logged in user. (Always null for server-initiated actions.)
  5. isFromTrustedCode: True if the insert, update, or upsert was initiated from trusted (server) code
  6. docId: The _id property of the document being inserted or updated. For an insert, this will be set only when it is provided in the insert doc, or when the operation is initiated on the client. For an update or upsert, this will be set only when the selector is or includes the _id, or when the operation is initiated on the client.
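A sketch of a custom function that uses these properties, for example to let only trusted code or the current user set an owner field (the field name is illustrative):

```javascript
ownerId: {
  type: String,
  custom: function () {
    // Trusted (server) code may set this field freely.
    if (this.isFromTrustedCode) return;

    // Client-initiated inserts/updates may only set it to the current user.
    if (this.isSet && this.value !== this.userId) {
      return "notAllowed";
    }
  }
}
```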

What is the purpose of the Collection2-Migrations package?

This package helps you manage your DB migrations with regard to aldeed:collection2 and aldeed:simple-schema. You should probably use the bookmd:schema-migrations package instead. https://github.com/davidyaha/meteor-collection2-migrations/

Backing up your current DB is strongly advised. The package will note when there is a change in the schema and, after a notable change, will try to auto-migrate. Auto migration will do the following for each document in the collection:

  1. Remove any fields that were deleted from the schema.
  2. Use auto or default values to fill in any missing required fields.
  3. Auto-convert field types where possible, using Collection2's built-in conversion.

Auto migration will currently not:

  1. Move fields from one collection to another.
  2. Rename a field.
  3. Check for missing ids on fields that are supposed to relate to another document.
  4. Rebuild indexes.
  5. Auto fill values that fail on a regEx.

All of the above features are planned to be implemented eventually, but code submissions are welcome.

Use the regular attachSchema call:

Books = new Mongo.Collection('books');

var booksV1 = new SimpleSchema({
    name: {
      type: String
    },
    author: {
      type: String
    },
    isbn: { //ISBN 10
      type: String,
      regEx: /ISBN\x20(?=.{13}$)\d{1,5}([- ])\d{1,7}\1\d{1,6}\1(\d|X)$/,
      optional: true,
      unique: true
    }
  }
);

Books.attachSchema(booksV1);

Add custom migration functions to allow for more difficult migrations:

var booksV2 = new SimpleSchema({
    name: {
      type: String
    },
    author: {
      type: String
    },
    isbn: { //ISBN 13
      type: String,
      regEx: /ISBN(?:-13)?:?\x20*(?=.{17}$)97(?:8|9)([ -])\d{1,5}\1\d{1,7}\1\d{1,6}\1\d$/,
      unique: true
    },
    sold: {
      type: Number,
      defaultValue: 0
    }
  }
);
Books.addCustomMigration('migrate isbn-10 to isbn-13', function () {
  Books.find({isbn: { $exists: true }}).forEach(function (doc) {
    var newIsbn = doc.isbn.replace('ISBN', 'ISBN-13 978');
    Books.update({_id: doc._id}, {$set: {isbn: newIsbn}}, {validate: false});
  });
}, true);

Books.attachSchema(booksV2);

What is the purpose of the matb33:collection-hooks package?

Extends Mongo.Collection with before/after hooks for insert, update, remove, find, and findOne. Works across client, server or a mix. Also works when a client initiates a collection method and the server runs the hook, all while respecting the collection validators (allow/deny).

meteor add matb33:collection-hooks

How can we implement the before.insert hook?

.before.insert(userId, doc)

Allows you to modify doc as needed, or run additional functionality. this.transform() obtains the transformed version of the document, if a transform was defined.

var test = new Mongo.Collection("test");

test.before.insert(function (userId, doc) {
  doc.createdAt = Date.now();
});

How can we implement the .before.update collection hook?

.before.update(userId, doc, fieldNames, modifier, options)

Fired before the doc is updated. Allows you to change the modifier as needed, or run additional functionality. this.transform() obtains the transformed version of the document, if a transform was defined.

test.before.update(function (userId, doc, fieldNames, modifier, options) {
  modifier.$set = modifier.$set || {};
  modifier.$set.modifiedAt = Date.now();
});

Note that we are changing modifier, and not doc. Changing doc won't have any effect as the document is a copy and is not what ultimately gets sent down to the underlying update method.

How can we implement the .before.remove collection hook?

.before.remove(userId, doc)

Fired just before the doc is removed. Allows you to affect your system while the document is still in existence, which is useful for maintaining system integrity, such as cascading deletes. this.transform() obtains the transformed version of the document, if a transform was defined.

test.before.remove(function (userId, doc) {
  // ...
});

How can we implement the .before.upsert collection hook?

.before.upsert(userId, selector, modifier, options)

Fired before the doc is upserted. Allows you to change the modifier as needed, or run additional functionality. this.transform() obtains the transformed version of the document, if a transform was defined.

test.before.upsert(function (userId, selector, modifier, options) {
  modifier.$set = modifier.$set || {};
  modifier.$set.modifiedAt = Date.now();
});

Note that calling upsert will always fire .before.upsert hooks, but will call either .after.insert or .after.update hooks depending on the outcome of the upsert operation. There is no such thing as a .after.upsert hook at this time.

How can we implement the .after.insert collection hook?

Fired after the doc was inserted. Allows you to run post-insert tasks, such as sending notifications of new document insertions. this.transform() obtains the transformed version of the document, if a transform was defined; this._id holds the newly inserted _id, if available.

test.after.insert(function (userId, doc) {
  // ...
});

How can we pass options to the collection hooks?

As of version 0.7.0, options can be passed to hook definitions. Default options can be specified globally and on a per-collection basis for all or some hooks, with more specific ones having higher specificity. Examples (in order of least specific to most specific):

CollectionHooks.defaults.all.all = {exampleOption: 1};

CollectionHooks.defaults.before.all = {exampleOption: 2};
CollectionHooks.defaults.after.all = {exampleOption: 3};

CollectionHooks.defaults.all.update = {exampleOption: 4};
CollectionHooks.defaults.all.remove = {exampleOption: 5};

CollectionHooks.defaults.before.insert = {exampleOption: 6};
CollectionHooks.defaults.after.remove = {exampleOption: 7};

Similarly, collection-wide options can be defined (these have a higher specificity than the global defaults from above):

var testCollection = new Mongo.Collection("test");

testCollection.hookOptions.all.all = {exampleOption: 1};

testCollection.hookOptions.before.all = {exampleOption: 2};
testCollection.hookOptions.after.all = {exampleOption: 3};

testCollection.hookOptions.all.update = {exampleOption: 4};
testCollection.hookOptions.all.remove = {exampleOption: 5};

testCollection.hookOptions.before.insert = {exampleOption: 6};
testCollection.hookOptions.after.remove = {exampleOption: 7};

Currently (as of 0.7.0), only fetchPrevious is implemented as an option, and is only relevant to after-update hooks.
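Assuming after-update hooks expose the pre-update document via this.previous (which is what fetchPrevious gates), a per-collection sketch might look like:

```javascript
var logs = new Mongo.Collection("logs");

// Skip fetching previous documents for this collection's after-update hooks,
// which can speed up updates that match many documents.
logs.hookOptions.after.update = {fetchPrevious: false};

logs.after.update(function (userId, doc, fieldNames, modifier, options) {
  // this.previous is unavailable here because fetchPrevious is false
});
```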

When implementing a collection hook, what is the meaning of this._super?

All hook callbacks have this._super available to them (the underlying method), as well as this.context, the equivalent of this for the underlying method. Additionally, this.args contains the original arguments passed to the method and can be modified by reference (for example, modifying a selector in a before hook so that the underlying method uses the new selector).

What is the effect of returning false in a before collection hook?

Returning false in any before hook will prevent the underlying method (and subsequent after hooks) from executing. Note that all before hooks will still continue to run even if the first hook returns false.
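For example, a before.remove hook can veto deletion of flagged documents (the locked flag is illustrative):

```javascript
test.before.remove(function (userId, doc) {
  if (doc.locked) {
    // Abort the remove; the underlying method and after hooks will not run.
    return false;
  }
});
```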

What is the purpose of the defaultUserId when implementing a collection hook?

You can define a defaultUserId in case you want to pass a userId to the hooks when there is no user context, for instance when executing an API endpoint where the userId is derived from a token. Just assign the userId to CollectionHooks.defaultUserId. It will be overridden by the userId of the context if one exists.
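A sketch of this in a server-side endpoint, where lookupUserIdFromToken is a hypothetical helper:

```javascript
// Derive the user from an API token, since there is no Meteor login context.
CollectionHooks.defaultUserId = lookupUserIdFromToken(req.headers['x-auth-token']);

// Hooks fired by this insert now receive that userId.
Posts.insert(doc);
```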

What is returned when we implement a collection hook?

When adding a hook, a handler object is returned with these methods:

  1. remove(): will remove that particular hook;
  2. replace(callback, options): will replace the hook callback and options.
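For example:

```javascript
var handle = test.before.insert(function (userId, doc) {
  doc.createdAt = Date.now();
});

handle.remove(); // this hook will no longer run

// Or swap in a different callback without re-registering:
// handle.replace(function (userId, doc) { doc.createdAt = new Date(); });
```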

How can we implement a global before insert collection hook?

CollectionHooks.defaults.before.insert might look like the place for this, but it holds default hook options (an object) rather than a hook callback, so assigning a function there will probably not work.

Unless otherwise stated, the content of this page is licensed under Creative Commons Attribution-ShareAlike 3.0 License