Recurring batch job import in AX 2012 R3

I was reading an article about the Data import export framework in the new Dynamics AX (AX 7) and it mentioned that the DIXF can be set up with an input folder. Any file placed in the input folder is processed and imported. Then it was pointed out to me that this function is also available in Dynamics AX 2012 – I’d just never noticed the setup options.

Mostly when we’re using the DIXF we’re setting up a new system, and most of our imports are “one-offs” (although if you get away with as few as three data import cycles you’re doing well). But there are some business processes that rely on fairly regular data imports – for instance, sales forecasts and sales prices are routinely imported into Dynamics AX and count as ‘business as usual’ (BAU) imports. So you might want to set these BAU imports up as recurring batch jobs.

I’m doing this example based on a Price discount journal.

First I’ll check/set up my Target entity. Data import export framework > Setup > Target entities:


I’m going to do this example using a CSV file format import – but just a reminder that the DIXF can also import from XML or Excel files.

So next I need a processing group. I create a new processing group and use the Entities button to set up the target entity and import format. Data import export framework > Common > Processing group > Entities:


Next I’m going to define my import format and customise it slightly to simplify the import, using the techniques I’ve described before (here and here).

I end up with a mapping which looks like this:


And an input file which looks like this:


And when I preview my source file I see:


Excellent – my journal number, line number and Relation have all defaulted correctly.
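If you want to knock up a test file of your own, something like this will do – a purely illustrative sketch, where the column names (ItemRelation, AccountRelation, Amount, FromDate) and the values are my own assumptions; the columns you actually need depend entirely on your source mapping:

```python
# Illustrative only – writes a tiny price/discount journal line import file.
# Column names and values are made up; match them to your own source mapping.
import csv

header = ["ItemRelation", "AccountRelation", "Amount", "FromDate"]
rows = [
    ["1000", "US-001", "125.00", "01/07/2016"],
    ["1001", "US-001", "89.50", "01/07/2016"],
]

with open("PriceDiscImport.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(header)
    writer.writerows(rows)
```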

Now we are going to set this up as a recurring batch job – and that is done back on the processing group when you Get staging data:


Although you are offered a default Job ID, you can enter your own, and a description:


Then click OK:


Now for the clever bit – we have ‘Processing directory’, ‘Completed directory’ and ‘Error directory’ parameters, but to make these active we first have to change the import Type from ‘File’ to ‘Directory’, which changes our ‘File path’ field into a ‘Folder path’. I’ve got an input folder ready to use, and Processing, Completed and Error directories:


And just to save another processing step I’m going to tick ‘Execute target step’, so that after my data is imported to the staging table, it’ll be copied to the target table automatically.
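To picture what the recurring job will be doing with those folders, here’s a rough sketch in Python – purely illustrative, not the DIXF code, and the folder paths are just my example:

```python
# Illustrative sketch only – not DIXF code. Each file found in the input folder
# is moved to the Processing directory while it is imported, then to Completed
# on success or Error on failure.
import shutil
from pathlib import Path

INPUT = Path(r"C:\DIXF\Input")            # the 'Folder path'
PROCESSING = Path(r"C:\DIXF\Processing")  # 'Processing directory'
COMPLETED = Path(r"C:\DIXF\Completed")    # 'Completed directory'
ERROR = Path(r"C:\DIXF\Error")            # 'Error directory'

def import_file(path: Path) -> None:
    """Stand-in for the import to staging (and the target step, if ticked)."""
    print(f"Importing {path.name} ...")

def process_input_folder() -> None:
    for source in INPUT.glob("*.csv"):
        working = PROCESSING / source.name
        shutil.move(str(source), str(working))      # file leaves the input folder
        try:
            import_file(working)
            shutil.move(str(working), str(COMPLETED / working.name))
        except Exception:
            shutil.move(str(working), str(ERROR / working.name))

if __name__ == "__main__":
    process_input_folder()   # in AX this happens on each batch job recurrence
```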

Now let’s set this up as a recurring batch job, and put it through its paces. Click Run and the familiar batch job setup form appears – you know what to do:


And of course when we click OK we get:


Incidentally, you’ll notice that I’ve got into the habit of embedding the Company name in the job description – it just makes it easier to search for. System administration > Inquiries > Batch jobs:


First, let’s see what happens when the batch job runs and the input folder is empty:


I’m not sure that an error message is called for here – I would have thought an Information message would suffice – and it looks like there’s already a fix available on LCS for this (KB3161169).

Although in my example the batch job history logs an error, the job keeps recurring, so now let’s place a file in the input folder, and wait patiently for the batch job to run again:


With a file to process we get a nice Infolog. And we can also see that in the Execution history of our Processing group:


The processing group execution history is cleaner than the batch job history, because here we’re only seeing the recurrences which processed files. Incidentally, I did another test where I placed two files in the import folder, and both were processed exactly as I would have expected – but I don’t think you can rely on the DIXF processing the files in any particular sequence if you have multiple files.

Finally let’s just go and check that our sales price trade agreement journal got imported and is ready to post.

Sales and marketing > Journals > Price/discount agreement journals:


Perfect!

And of course we have our original input file now sitting in our ‘Completed directory’:


To make this a fully finished business process you’ll want to add another job to delete the staging data – you could do that as a second task in this batch job. You’ll also want to tidy up the batch job execution history log by deleting those records after a few days. Housekeeping has become a bit of a hot topic in our office recently, and we’re all paying more attention to keeping the system neat and tidy.
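While we’re on housekeeping, the ‘Completed directory’ and ‘Error directory’ will fill up with processed files too. A scheduled script along these lines could prune them – just a sketch, with made-up paths and a made-up retention period:

```python
# Sketch only: delete processed import files older than a made-up retention
# period from the Completed and Error directories (paths are assumptions).
import time
from pathlib import Path

RETENTION_DAYS = 14
FOLDERS = [Path(r"C:\DIXF\Completed"), Path(r"C:\DIXF\Error")]

def prune(folder: Path, retention_days: int = RETENTION_DAYS) -> None:
    cutoff = time.time() - retention_days * 24 * 60 * 60
    for file in folder.glob("*.csv"):
        if file.stat().st_mtime < cutoff:
            file.unlink()
            print(f"Deleted {file}")

for folder in FOLDERS:
    prune(folder)
```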

Import an inventory journal with automatic numbering in AX 2012 R3

This post is a follow-up to Import an inventory journal in AX 2012 R3

I’m not going through the whole setup again here, but I’m going to pick it up where I left off and concentrate on the option for Automatic generation of a field value.

Here I’ve set up the Journal number and the Line number as automatic.


As I said in the previous post, that works perfectly for the Line number, and Dynamics AX tries to do the right thing for the Journal number by looking up the number sequence – the problem is it gets carried away and generates a new Journal number for each line. Slightly amusing if you import three lines, not so funny if you import 30,000:


So my original work-around for this was not to use automatic generation for the Journal number, but as Javier Sullivan pointed out in a comment on my previous post, there’s a ‘Query criteria’ option that I’d missed.

I’m in my processing group setup, Data import export framework > Common > Processing group > Entities:


And I click on Modify source mapping, and Mapping details, and select the Journal number (JournalID):


Now click on the Query criteria button:


Here we can specify any field from our import file; I’m going to choose the warehouse (InventLocationId), so that one Journal number is generated per warehouse rather than one per line:


Then just close the form.

I’ve added a couple of extra lines to my input file – now I have 5 rows, across two warehouses:
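To illustrate what we expect the Query criteria to give us, here’s a little sketch in Python (not AX code, and the item and warehouse values are made up): one journal number per distinct warehouse, instead of one per line.

```python
# Illustrative sketch (not AX code): the Query criteria groups the lines by
# warehouse, so one journal number is generated per warehouse, not per line.
from itertools import count

lines = [
    {"ItemId": "1000", "InventLocationId": "11", "Qty": 10},
    {"ItemId": "1001", "InventLocationId": "11", "Qty": 5},
    {"ItemId": "1002", "InventLocationId": "11", "Qty": 8},
    {"ItemId": "1003", "InventLocationId": "13", "Qty": 2},
    {"ItemId": "1004", "InventLocationId": "13", "Qty": 4},
]

number_sequence = (f"IJN-{n:06d}" for n in count(1))  # stand-in number sequence

journal_per_warehouse = {}            # warehouse -> generated journal number
for line in lines:
    warehouse = line["InventLocationId"]
    if warehouse not in journal_per_warehouse:
        journal_per_warehouse[warehouse] = next(number_sequence)
    line["JournalId"] = journal_per_warehouse[warehouse]

for line in lines:
    print(line["JournalId"], line["ItemId"], line["InventLocationId"], line["Qty"])

# Five lines, two warehouses -> two journals: IJN-000001 and IJN-000002
```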


On the Processing group I perform the ‘Get staging data’:


And ‘Copy data to target’:


And check my journals, Inventory management > Journals > Movement journals:


And:


Just like it says on the tin!

Import an inventory journal in AX 2012 R3

Pretty much every Dynamics AX implementation I’ve been involved in so far has required the import of stock on-hand at go-live (although as it happens, my current project doesn’t include any inventory management initially – still they’ll get around to that eventually).

I’ve used various methods to import stock on hand in the past – and typically this is done as a movement journal. You probably know that with an Inventory movement journal you can specify the offset inventory account – whereas with an inventory adjustment journal (or a counting journal) the offset account is determined by the item’s posting profile, typically via the item’s Item group.

In Dynamics AX 2012 R3 we have the Data import export framework (DIXF) – and this is a walk-through of importing an inventory movement journal.

First we set up the DIXF parameters. Data import export framework > Setup > Data import/export framework parameters:


You need to have a working directory set up.

Next we’ll set up the import format as a CSV file. Data import export framework > Setup > Source data formats:


The DIXF comes with a number of standard ‘Target entities’ – we need the inventory journal. Data import export framework > Setup > Target entities:


The Target entity defines the format of the Staging table, and maps it to the actual Dynamics AX tables.
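If it helps to think of that in code terms, the relationship is roughly this – a conceptual sketch only, nothing to do with the actual DIXF classes, and the table/field names are just examples:

```python
# Conceptual sketch only – not the DIXF implementation. A target entity is,
# in effect, a description of the staging columns plus a mapping from each
# staging column to a field on a real Dynamics AX table (names illustrative).
staging_row = {"JournalNameId": "IMov", "ItemId": "1000", "Qty": 100}

target_mapping = {
    "JournalNameId": ("InventJournalTable", "JournalNameId"),
    "ItemId":        ("InventJournalTrans", "ItemId"),
    "Qty":           ("InventJournalTrans", "Qty"),
}

for column, value in staging_row.items():
    table, field = target_mapping[column]
    print(f"staging.{column} = {value!r}  ->  {table}.{field}")
```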

I’m working in the AX 2012 R3 CU9 Hyper-V demo image, and to get this to work I had to cheat a bit. First I opened the AOT and made the field EntityTypeName editable:


Then back on the Target entities form I set the ‘Entity’ field:


There’s no reason why you should have to do that – but there’s no point in me missing out a step I took, is there?

To use the Target entity we need a Processing group. Go to Data import export framework > Common > Processing group and create a new Processing group:


Click on the ‘Entities’ button and select the ‘Inventory journal’ Target entity and the CSV file format:


Now we can use the ‘Generate source file’ wizard to setup our input file:


I’ve selected the fields I want to import, and arranged them into a logical sequence.

‘Generate sample file’ flicks those fields into Notepad and I can save the file as a CSV file:
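If you’d rather script the sample file than type it into Notepad, something like this does the same job – the column list is just an example of the sort of fields you might pick in the wizard, not a definitive mapping:

```python
# Sketch: build a movement journal import file to suit a simple source mapping.
# The columns (item, site, warehouse, quantity, cost price) are examples only –
# match them to the fields you actually selected in the wizard.
import csv

header = ["ItemId", "InventSiteId", "InventLocationId", "Qty", "CostPrice"]
rows = [
    ["1000", "1", "11", "100", "12.50"],
    ["1001", "1", "11", "250", "3.75"],
    ["1002", "1", "13", "40", "99.00"],
]

with open("MovementJournalImport.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(header)
    writer.writerows(rows)
```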


Back in the Processing group’s ‘Select entities for processing group’ form, enter that file as your sample file, and click ‘Generate source mapping’:


You should get:


Now click ‘Validate’. I’m not sure what it does, but why wouldn’t you click a button called ‘Validate’?


Now we’re ready to open our CSV file in Excel and add some data:


And save the CSV file. Incidentally, I’m sure you’ve noticed that after you’ve saved the CSV file, when you close Excel it prompts you to save your spreadsheet. I guess you only go into that endless loop once.

Back on the ‘Select entities for processing group’ form, it’s worth clicking ‘Preview source file’ – if that doesn’t work here, you’re not going to get much further:


So far, so good. Back to the Processing group and click on ‘Get staging data’:


If you’re doing a lot of imports, enter a Description – it’s optional. Click OK:


That’s the default Source file we entered above – it’s the one we want to use, so just click ‘Run’, although at this point we could enter a different import file (as long as it has the same format). Run gives us the option to set up a batch job:


Just click OK to run online and you should get:


Nearly there – back on the Processing group click ‘Copy data to target’:


You have to enter/select your import to staging Job ID (and now you can see why it’s a good idea to enter a meaningful Description when you copy data to Staging). OK gives you another Run form:


Not so obvious, but if you have a huge data import this is where you can set up multi-threading for the batch job you’re about to create, by specifying a number of Bundles. We’ll press on and click Run:
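As a rough analogy of what those Bundles do (definitely not the AX implementation – a Python sketch with assumed numbers), the staging records get split into that many bundles, each processed by its own parallel task:

```python
# Rough analogy only – not the AX implementation. Split the staging records
# into N bundles and process the bundles in parallel, which is essentially
# what specifying a number of Bundles asks the batch job to do.
from concurrent.futures import ThreadPoolExecutor

def split_into_bundles(record_ids, bundle_count):
    """Deal the record ids out into bundle_count roughly equal lists."""
    bundles = [[] for _ in range(bundle_count)]
    for i, rec_id in enumerate(record_ids):
        bundles[i % bundle_count].append(rec_id)
    return bundles

def copy_bundle_to_target(bundle):
    # stand-in for 'copy data to target' running against one bundle
    return len(bundle)

staging_record_ids = list(range(1, 30001))   # e.g. 30,000 staging records
bundles = split_into_bundles(staging_record_ids, bundle_count=8)

with ThreadPoolExecutor(max_workers=8) as pool:
    processed = sum(pool.map(copy_bundle_to_target, bundles))

print(f"Processed {processed} records in {len(bundles)} bundles")
```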


No surprises – you know what to do. OK gives:


Phew, I was holding my breath. Let’s see if the journal’s there:


What, no Lines? Go to the History tab and click on ‘Recalculation’:


That’s better. Inspect the lines:


Optionally, validate, and then Post.


Success!

In a previous post, A user friendly data import framework in Dynamics AX 2012-R3, I talked about customising the Processing group mapping to make the import format easier for the user. One of the options there is to set a field for Automatic generation:


That works perfectly for the Line number, and Dynamics AX tries to do the right thing for the Journal number by looking up the number sequence – the problem is it gets carried away and generates a new Journal number for each line. Slightly amusing if you import three lines, not so funny if you import 30,000: