Add initial/basic docs for batches

that still needs a lot yet, I'm sure, but it's a start
This commit is contained in:
Lance Edgar 2021-01-16 16:13:52 -06:00
parent da9823324d
commit e5e642ad8b
6 changed files with 260 additions and 6 deletions

View file

@@ -4,4 +4,5 @@ App Handlers
TODO
* Report Handler
* :doc:`/data/reports/handler`
* :doc:`/data/batch/handlers`

View file

@@ -3,4 +3,73 @@
Adding a Custom Batch
=====================
TODO
If none of the native batch types fit your needs, you can always add a new one.
See :doc:`native` for the list of what's available.
Adding a new batch type involves 3 steps:

* add the batch table schema
* add the batch handler
* add web views for user interaction
For the sake of example here, we'll define a "Please Come Back" batch. The purpose
of this batch is to identify members who've not shopped in the store within the
last 6 months, and generate custom coupons for each, to encourage them to come
back.
Adding the batch table schema is not much different from adding any other table
to the DB; see also :doc:`/data/db/extend`. Note that you should always prefix
table names with something unique to your app, and generally speaking the word
"batch" should go in there somewhere too. For instance we might call them:

* ``poser_batch_plzcomeback``
* ``poser_batch_plzcomeback_row``
To add the batch tables, create a new Python module. This would go in
``~/src/poser/poser/db/model/batch/plzcomeback.py`` for instance. Within that
module you then define the batch table classes. Be sure to also define the
``batch_key`` for the main table, which should also get the app-specific
prefix, e.g.::

    from rattail.db.model import Base, BatchMixin, BatchRowMixin


    class PleaseComeBack(BatchMixin, Base):
        __tablename__ = 'poser_batch_plzcomeback'
        __batchrow_class__ = 'PleaseComeBackRow'
        batch_key = 'poser_plzcomeback'
        model_title = "Please Come Back Batch"
        model_title_plural = "Please Come Back Batches"


    class PleaseComeBackRow(BatchRowMixin, Base):
        __tablename__ = 'poser_batch_plzcomeback_row'
        __batch_class__ = PleaseComeBack
        model_title = "Please Come Back Batch Row"
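
Note that later examples will reference these classes via ``poser.db.model``.
One typical way to make that work (shown here only as a hedged sketch; adjust
for however your app's model package is organized) is to import them within
e.g. ``~/src/poser/poser/db/model/__init__.py``::

    # bring in all of Rattail's native models first
    from rattail.db.model import *

    # then bring in our custom batch models
    from .batch.plzcomeback import PleaseComeBack, PleaseComeBackRow
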
To add the batch handler, create a new Python module. This would go in
``~/src/poser/poser/batch/plzcomeback.py`` for instance. Within that module
you then define the batch handler class. Be sure to tie this back to the main
batch table, e.g.::

    from rattail.batch import BatchHandler

    from poser.db import model


    class PleaseComeBackHandler(BatchHandler):
        batch_model_class = model.PleaseComeBack
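
The handler is also a typical home for whatever logic finds the relevant data.
Continuing the example, a rough sketch of a ``populate()`` method follows.
This is illustrative only: the ``last_shopped`` column and the ``member`` row
attribute are hypothetical, and method names such as ``should_populate()`` and
``add_row()`` reflect the typical handler API, so confirm them against your
Rattail version::

    import datetime

    from sqlalchemy import orm

    from rattail.batch import BatchHandler

    from poser.db import model


    class PleaseComeBackHandler(BatchHandler):
        batch_model_class = model.PleaseComeBack

        def should_populate(self, batch):
            # this batch always finds its own data; no data file etc. is needed
            return True

        def populate(self, batch, progress=None):
            session = orm.object_session(batch)
            cutoff = datetime.date.today() - datetime.timedelta(days=180)

            # NB. 'last_shopped' is a hypothetical column, for illustration only
            members = session.query(model.Member)\
                             .filter(model.Member.last_shopped < cutoff)

            for member in members:
                # NB. assumes the custom row class has a 'member' relationship
                row = model.PleaseComeBackRow()
                row.member = member
                self.add_row(batch, row)
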
To add the web views, also create a new Python module. This one would go in
``~/src/poser/poser/web/views/batch/plzcomeback.py`` for instance. Within that
module you then define the batch master view class. Be sure to tie this back
to both the batch and row tables, as well as the handler, e.g.::

    from tailbone.views.batch import BatchMasterView

    from poser.db import model


    class PleaseComeBackView(BatchMasterView):
        model_class = model.PleaseComeBack
        model_row_class = model.PleaseComeBackRow
        default_handler_spec = 'poser.batch.plzcomeback:PleaseComeBackHandler'
        route_prefix = 'batch.poser_plzcomeback'
        url_prefix = '/batches/please-come-back'


    def includeme(config):
        PleaseComeBackView.defaults(config)
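
For the web app to actually use these views, this module must also be brought
into the Pyramid config, typically by adding a line such as the following
wherever your app includes its other view modules::

    config.include('poser.web.views.batch.plzcomeback')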

View file

@@ -3,4 +3,17 @@
Batch Handlers
==============
TODO
A "batch handler" contains logic to handle the entire life cycle of a batch.
Meaning, the batch handler is what will actually create the batch, and update
it per user edits, refresh it etc. and ultimately execute it.
The UI exposes tools for interacting with the batch but ultimately the handler
is what "acts on" the batch based on user input.
See also :doc:`/base/handlers` for general info on handlers.
Each type of batch will have precisely one "handler" associated with it. For
instance, there is a handler for inventory batches, and another for label
batches, etc. The app can override the handler for each batch type, should it
need to customize behavior. And of course if the app adds a new batch
type then it must also provide a handler for it.
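
For instance, an app might subclass a native handler in order to tweak its
behavior, and then configure that batch type to use the new handler (the
handler override is normally expressed in config as a "spec" string pointing
at the subclass). A hedged sketch, with ``poser.batch.inventory`` being a
hypothetical module::

    from rattail.batch import inventory as base


    class InventoryBatchHandler(base.InventoryBatchHandler):
        """
        Custom handler for inventory batches.
        """

        def refresh_row(self, row):
            # first do whatever the native handler would do...
            super(InventoryBatchHandler, self).refresh_row(row)

            # ...then apply any app-specific tweaks to the row here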

View file

@@ -3,4 +3,17 @@
Native Batch Types
==================
TODO
The following batch types are provided by Rattail itself. Listed below for
each is the "batch type key" followed by the descriptive name.

* ``custorder`` - Customer Order Batch
* ``handheld`` - Handheld Batch
* ``importer`` - Importer Batch
* ``inventory`` - Inventory Batch
* ``labels`` - Label Batch
* ``newproduct`` - New Product Batch
* ``pricing`` - Pricing Batch
* ``product`` - Product Batch
* ``purchase`` - Purchasing Batch
* ``vendor_catalog`` - Vendor Catalog
* ``vendor_invoice`` - Vendor Invoice

View file

@@ -3,4 +3,140 @@
Overview
==========
TODO
What is a "batch" anyway? That seems to be one of those terms which means
something specific in various contexts, but the meanings don't always line up.
So let's define it within the Rattail context:

A "batch" is essentially any set of "pending" data, which must be "reviewed" by
an authorized user, and ultimately then "executed" - which causes the data set
to go into production by way of updating the various systems involved.
This means that if you do not specifically need a review/execute workflow, then
you probably do not need a batch. It also means that if you *do* need a batch,
then you will probably also need a web app, to give the user an actual way to
review/execute. (See also :doc:`/web/index`.) Most of the interesting parts
of a batch, however, do belong to the Data Layer, and so are described here.
Beyond that vague conceptual definition, a "batch" in Rattail also implies the
use of its "batch framework" - meaning there is a "batch handler" involved etc.
Batch Workflow
--------------
All batches have the following basic workflow in common:

* batch is created
* if applicable, batch is initially populated from some data set
* batch is reviewed and if applicable, further modified
* batch is executed
In nearly all cases, the execution requires user intervention, meaning a batch
is rarely "created and executed" automatically.
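
To make that more concrete, here is the same life cycle sketched purely in
code. (In practice the review and execute steps nearly always happen in the
web app instead.) This is illustrative only: the handler class and user shown
are hypothetical, and method names such as ``make_batch()``, ``do_populate()``
and ``do_execute()`` reflect the typical handler API, so confirm them against
your Rattail version::

    from rattail.config import make_config
    from rattail.db import Session, model

    # hypothetical handler class; use whichever batch handler you care about
    from poser.batch.example import PoserExampleHandler

    # normal app bootstrap; assumes config file(s) define the Rattail DB
    config = make_config()
    session = Session()
    handler = PoserExampleHandler(config)

    # whichever user is responsible for the batch ('admin' is only an example)
    user = session.query(model.User).filter_by(username='admin').one()

    # create the batch, and populate it if applicable
    batch = handler.make_batch(session, created_by=user)
    if handler.should_populate(batch):
        handler.do_populate(batch, user)

    # ...normally the user would now review (and perhaps edit) the batch...

    # finally, execute the batch
    handler.do_execute(batch, user)
    session.commit()
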
Batch vs. Importer
------------------
Let's say you have a CSV of product data and you wish to import that to the
``product`` table of your Rattail DB. Most likely you will use an "importer"
only, in which case the data goes straight in.
But let's say you would prefer to bring the CSV data into a "batch" within the
DB instead, and then a user must review and execute that batch in order for the
data to truly update your ``product`` table.
That is the essential difference between an importer and a batch - a batch will
always require user intervention (review/execute) whereas a plain importer
never does.
Now, that said, what if you have a working importer but you would rather
"interrupt" its process and introduce a user review/execute step before the
data is truly imported to the target system? Well, that is also possible, and
we refer to that as an "importer batch" (creative, I know).
..
Batch Archetypes
----------------
Here are what you might call "archetypes" for all possible batches.
Batch from Importer
-------------------
aka. "importer batch"
This is relatively less common in practice, but it serves as a good starting
point since we already mentioned it.
With an "importer batch" the data in question always comes from whatever the
"source" (host) is for the importer. The schema for the batch data table is
kept dynamic so that "any" importer can produce a batch, theoretically. Each
importer run will produce a new table in the DB, whose columns will depend on
what the importer run provided. Such dynamic batch data tables are
kept in a separate (``batch``) schema within the DB, to avoid cluttering the
``default`` schema. (Note that the ``batch`` schema must be explicitly created
if you need to support importer batches.)
The execution step for this type of batch is also relatively well-defined.
Since the batch itself is merely "pausing" the processing which the importer
normally would do, when the batch is executed it merely "unpauses" and the
remainder of the importer logic continues. The only other difference is that
the importer will not read data from its normal "source" but will instead
simply read data from the batch; from there it continues as normal, writing
data to the target/local system.
Batch from User-Provided File
-----------------------------
More typically a batch will be "created" when a user uploads a data file. With
this type of batch, the data obviously comes from the uploaded file, but it may
be supplemented via SQL lookups etc. The goal is to take what the user gave
us, and then load the "full picture" into the batch data tables.
The user then will review/execute as normal. What such a batch actually does
when executed can vary quite a bit.
Note that in some cases, a user-provided file can be better handled via
importer only, with no batch at all unless you need the review/execute
workflow.
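
As a rough illustration, the ``populate()`` method for such a batch might look
something like the following. How you locate the uploaded file, and which row
columns you fill in, depend entirely on your batch schema; the names below are
hypothetical stand-ins::

    import csv


    def populate(self, batch, progress=None):

        # NB. stand-in path; in practice this would come from the batch / config
        path = '/tmp/upload.csv'

        with open(path, 'rt') as f:
            for data in csv.DictReader(f):

                # take what the user gave us... ('item_id' is hypothetical)
                row = self.make_row()
                row.item_id = data.get('item_id')

                # ...and let add_row() / refresh_row() supply the "full picture"
                self.add_row(batch, row)
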
Batch from User-Provided Query
------------------------------
In this pattern the user starts by filtering e.g. the Products table grid
according to their needs. When they're happy with the results, they create a new
batch using that same query. So this defines the data set for the batch - it
is simply all products matching the query.
Note that Products is the only table/query supported for this method thus far.
It is possible to create multiple batch types with this method.
The user then will review/execute as normal. What such a batch actually does
when executed can vary quite a bit. A common example might be to create a
Labels batch from a product query, execution of which will print labels for
all products in the batch.
Batch from Custom Data Set
--------------------------
In this pattern the logic used to obtain the data is likely hard-coded within
the batch handler.
For instance let's say you need a batch which allowed the user to generate
"Please Come Back!" coupons for all active members who have not shopped in the
store in the past 6 months.
You should not require the user to provide a data file etc. because the whole
point of this batch is to identify relevant members and issue coupons. If the
batch itself can identify the members, then there is no need for the user to
do that.
So here the batch handler logic would a) create a new empty batch, then b)
identify members not seen in 6 months, adding each as a row to the batch,
then c) the user reviews and executes per usual, at which point d) coupons are
issued.
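
Sketched in code, the execution step for this batch might look something like
the following; ``issue_coupon()`` and the ``member`` row attribute are
hypothetical stand-ins for your actual coupon logic and schema::

    from sqlalchemy import orm


    def execute(self, batch, progress=None, **kwargs):
        session = orm.object_session(batch)

        for row in batch.active_rows():
            # issue_coupon() is a hypothetical stand-in for real coupon logic
            self.issue_coupon(session, row.member)

        return True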

View file

@@ -3,4 +3,26 @@
Batch Table Schema
==================
TODO
From a schema standpoint there are 2 types of batches generally speaking:
static and dynamic.
Static batch types have 2 "dedicated" tables within the ``default`` schema of
the DB: one table for the batch "headers" and another for its "data". For
instance the basic inventory batch uses these tables:

* ``batch_inventory``
* ``batch_inventory_row``
When a new inventory batch is created, one new ``batch_inventory`` record is
created, along with potentially many new ``batch_inventory_row`` records. The
latter table contains all rows for all inventory batches.
Dynamic batch types, however, are just that, and do not have a dedicated "data"
table, although they still do have a dedicated "header" table. For instance
this approach is used for "importer batches": when an importer run creates a
batch, it will create one new ``batch_importer`` record, but then will also
create a brand new table, with custom schema based on the importer, to hold
the data.
To avoid cluttering the ``default`` schema within the DB, dynamic batch data
tables are not stored there. Instead they are kept within the ``batch``
schema, which must be manually created if you need it.
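
For instance, with a PostgreSQL database the ``batch`` schema can be created
with a single statement, run against your Rattail DB by a sufficiently
privileged user::

    CREATE SCHEMA batch;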