Publish and synchronize repeated export
Automatic execution of the Publish & Synchronize process does not always resolve all your tasks. For example, you might want to re-synchronize (re-sync) the published data in the key-value store (Redis or Valkey) and Elasticsearch to display updated information in your shop front end. Or you might want to regenerate the published data and rewrite the data in the Storage and Search module database tables, with a subsequent update of the key-value store (Redis or Valkey) and Elasticsearch records. This can be done manually by running console commands.
Data re-synchronization
In some cases, you might want to re-export data into the key-value store (Redis or Valkey) and Elasticsearch. For example, if the key-value store has been flushed and its data or the Elasticsearch data is lost.
Re-export the data:
vendor/bin/console sync:data
This command does the following:
- Reads the aggregated data from the database tables of the Storage and Search modules.
- Sends the data to the RabbitMQ queues.
- Copies the data from the RabbitMQ queues to the key-value store (Redis or Valkey) and Elasticsearch.
You can limit the command to particular Storage and Search tables by specifying entity names as follows:
vendor/bin/console sync:data {resource_name}
For example, the command to re-sync data for CMS Block looks as follows:
vendor/bin/console sync:data cms_block
To trigger a data re-sync for a resource, there must be a corresponding synchronization plugin created for this resource. To learn how to create it, see Implement synchronization plugins.
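As a rough illustration, such a plugin is typically registered in the project-level SynchronizationDependencyProvider. The following is a minimal sketch only: the CmsBlockSynchronizationDataPlugin class and its namespace are assumptions, and the exact method to override can differ between Spryker versions, so follow the linked guide for the canonical implementation.

```php
<?php

namespace Pyz\Zed\Synchronization;

use Spryker\Zed\Synchronization\SynchronizationDependencyProvider as SprykerSynchronizationDependencyProvider;
// Assumed plugin class name; in a real project it comes from the Storage module of your resource.
use Pyz\Zed\CmsBlockStorage\Communication\Plugin\Synchronization\CmsBlockSynchronizationDataPlugin;

class SynchronizationDependencyProvider extends SprykerSynchronizationDependencyProvider
{
    /**
     * Registers synchronization data plugins so that
     * `vendor/bin/console sync:data cms_block` can resolve the `cms_block` resource.
     *
     * @return array<\Spryker\Zed\SynchronizationExtension\Dependency\Plugin\SynchronizationDataPluginInterface>
     */
    protected function getSynchronizationDataPlugins(): array
    {
        return [
            new CmsBlockSynchronizationDataPlugin(),
        ];
    }
}
```

With the plugin registered, the resource name it exposes is what you pass to sync:data.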
Published data re-generation
You can regenerate published data from scratch. For example, when something went wrong during a product import and you want to re-publish the data. In other words, you need to update the Storage and Search tables and sync the data to the key-value store (Redis or Valkey) and Elasticsearch.
Regenerate published data:
vendor/bin/console publish:trigger-events
vendor/bin/console event:trigger is deprecated.
This command does the following:
- Reads data from the Storage and Search tables and re-writes them.
- Updates the key-value store (Redis or Valkey) and Elasticsearch records.
You can specify particular Storage and Search tables by indicating resource names as follows:
vendor/bin/console publish:trigger-events -r {resource_name}
Also, you can add one or more entity IDs:
vendor/bin/console publish:trigger-events -r {resource_name} -i {ids}
For example, the command to regenerate the published data for the CMS Block and Availability resources with IDs 1 and 2 looks as follows:
vendor/bin/console publish:trigger-events -r cms_block,availability -i 1,2
To trigger a data re-publish for a resource, there must be a corresponding publisher plugin created for this resource. To learn how to create it, see Implement event trigger publisher plugins.
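For orientation, a publisher trigger plugin typically implements PublisherTriggerPluginInterface from the PublisherExtension module. The skeleton below assumes the cms_block resource; the class name, namespace, facade call, event name, and ID column are illustrative placeholders, so treat the linked guide as the source of truth.

```php
<?php

namespace Pyz\Zed\CmsBlockStorage\Communication\Plugin\Publisher;

use Spryker\Zed\Kernel\Communication\AbstractPlugin;
use Spryker\Zed\PublisherExtension\Dependency\Plugin\PublisherTriggerPluginInterface;

class CmsBlockPublisherTriggerPlugin extends AbstractPlugin implements PublisherTriggerPluginInterface
{
    /**
     * Returns a chunk of entities to re-publish when
     * `vendor/bin/console publish:trigger-events -r cms_block` runs.
     */
    public function getData(int $offset, int $limit): array
    {
        // Hypothetical facade method: loads CMS block data in chunks.
        return $this->getFacade()->getCmsBlockCollection($offset, $limit);
    }

    public function getResourceName(): string
    {
        // Resource name passed with the -r option.
        return 'cms_block';
    }

    public function getEventName(): string
    {
        // Publish event dispatched for each returned entity; illustrative value.
        return 'Entity.spy_cms_block.publish';
    }

    public function getIdColumnName(): ?string
    {
        // Column matched against the IDs passed with the -i option.
        return 'id_cms_block';
    }
}
```

Once such a plugin exists and is registered, publish:trigger-events can address the resource by the name returned from getResourceName().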