Continuing from the previous step, this one actually does something useful, namely transferring some data from one location to another.

<?xml version="1.0" encoding="UTF-8"?>
<xdtl:package name="loader.test"
  xmlns:xdtl="http://xdtl.org/xdtl" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://xdtl.org/xdtl http://xdtl.org/xdtl/xdtl.xsd">

  <xdtl:parameter name="filename" required="1"></xdtl:parameter>
  <xdtl:parameter name="tablename" required="1"></xdtl:parameter>
  <xdtl:parameter name="dbhost" default="localhost:5432" required="0"></xdtl:parameter>
  <xdtl:parameter name="dbname" default="postgres" required="0"></xdtl:parameter>
  <xdtl:parameter name="dbuser" default="postgres" required="0"></xdtl:parameter>
  <xdtl:parameter name="dbpass" default="" required="0"></xdtl:parameter>

  <xdtl:variable name="tmpdir">/tmp/incoming</xdtl:variable>
  <xdtl:variable name="bakdir">/tmp/incoming/bak</xdtl:variable>
  <xdtl:variable name="nocr">-e 's/\r//g'</xdtl:variable>
  <xdtl:variable name="tempdb">staging</xdtl:variable>

  <xdtl:connection name="local" type="DB">jdbc:postgresql://$dbhost/$dbname?user=$dbuser&amp;password=$dbpass</xdtl:connection>

  <xdtl:tasks>
    <xdtl:task name="load_file">
      <xdtl:steps>
        <xdtl:get source="ftp://www.site.com/$filename.zip" target="$tmpdir" overwrite="1"/>
        <xdtl:unpack source="$tmpdir/$filename.zip" target="$tmpdir" overwrite="1"/>
        <xdtl:strip expr="$nocr" source="$tmpdir/$filename.txt" target="$tmpdir/$filename.tmp" overwrite="1"/>
        <xdtl:read source="$tmpdir/$filename.tmp" target="$tempdb.$filename" connection="$local" type="CSV" overwrite="1" delimiter=";" quote='"'/>
        <xdtl:query cmd="SELECT COUNT(*) FROM pg_tables WHERE tablename='$tablename'" target="exist" connection="$local"/>
        <xdtl:if expr="${exist != 0}">
          <xdtl:query cmd="INSERT INTO $tablename
            SELECT t0.*
            FROM $tempdb.$filename t0
            LEFT OUTER JOIN $tablename t1 ON t1.id = t0.id
            WHERE t1.id IS NULL" connection="$local"/>
        </xdtl:if>
        <xdtl:move source="$tmpdir/$filename.zip" target="$bakdir/$filename.$xdtlDateCode.zip" overwrite="1"/>
        <xdtl:clear target="$tmpdir/$filename.*"/>
      </xdtl:steps>
    </xdtl:task>
  </xdtl:tasks>
</xdtl:package>

In short, a text file is downloaded (get), unzipped (unpack), stripped of CRs with sed (strip) and loaded into the staging area (read). If the target table exists, the rows that are not already in the target table get INSERTed. Finally, the incoming file is archived away (move) and the temp files are trashed (clear). All in all, a typical plain vanilla data warehousing task. At this point, some explanations are probably needed.
 
1. The required parameters, filename and tablename, get their values in some other (calling) package outside this one; if they don't, the package exits with an error. The rest of the parameters are optional, i.e. they might get their values from a calling package, and if they don't, they fall back to their defaults.
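 
For illustration, a calling package might pass the two required parameters somewhat like this (just a sketch; the exact calling syntax, element and attribute names included, should be checked against the XDTL schema):
 
<!-- sketch only: element and attribute names here are assumptions, not verified against the XDTL schema -->
<xdtl:call href="loader.test">
  <xdtl:parameter name="filename">orders</xdtl:parameter>
  <xdtl:parameter name="tablename">orders</xdtl:parameter>
</xdtl:call>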
 
2. The external utilities (get, unpack, strip, etc.) get their command line templates from the global configuration file (resources/globals.xml), like this:
 
<entry key="get">wget %target:-O :% %overwrite::-nc% %source%</entry>
 
%target%, %overwrite% and %source% are evaluated to the actual get element attribute values. 
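 
Assuming the %...% placeholders are simply replaced with the corresponding attribute values of the step (plus the optional prefix/suffix text around them), the get step of the package above boils down to an ordinary wget call, roughly:
 
# illustrative expansion only; with overwrite="1" wget's -nc (no-clobber) option is presumably left out
wget -O /tmp/incoming ftp://www.site.com/$filename.zip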
 
3. xdtlDateCode is a variable that gets evaluated to the current date in the startup.js script (there can be an arbitrary number of such variable definitions):
 
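// builds a YYYYMMDD date code from the current date, e.g. "20110215"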
var xdtlDateCode = java.lang.String.format("%1$tY%1$tm%1$td", new Array(new java.util.Date()));
 
To be continued...