[Dive4elements-commits] [PATCH] Merge with default branch
Wald Commits
scm-commit at wald.intevation.org
Fri Mar 22 11:26:05 CET 2013
# HG changeset patch
# User Christian Lins <christian.lins@intevation.de>
# Date 1363947954 -3600
# Branch mapgenfix
# Node ID 61bf64b102bcc89f798eff47fe1dc0eeeb8d6b49
# Parent cfc5540a4eecb3429c3dada2831b72953ece4d27
# Parent 02f6741f80d44cbb82a874f01925ae785d34dc2b
Merge with default branch
diff -r cfc5540a4eec -r 61bf64b102bc .hgtags
--- a/.hgtags Wed Mar 06 14:14:15 2013 +0100
+++ b/.hgtags Fri Mar 22 11:25:54 2013 +0100
@@ -26,3 +26,13 @@
859278918eb14a8687fef60f2b33dcf89fe71f90 2.9.9
859278918eb14a8687fef60f2b33dcf89fe71f90 2.9.9
53be7313310416e1f8c3e0ec414684ca9c6c71df 2.9.9
+f459911fdbfbe2b2d23e06faba4e338514dd7b54 2.9.10
+f459911fdbfbe2b2d23e06faba4e338514dd7b54 2.9.10
+8c65acf01adc7083c5936d0f8acf67374c97140b 2.9.10
+42bb6ff78d1b734341732772ab24db2a913311b0 2.9.11
+3b86bf214d53da51d85cd8c8ecfeec71aa9da9e4 2.9.12
+3b86bf214d53da51d85cd8c8ecfeec71aa9da9e4 2.9.12
+0000000000000000000000000000000000000000 2.9.12
+0000000000000000000000000000000000000000 2.9.12
+88e3473a38467e8b5bb7d99e92c3f1a795515bf5 2.9.12
+7fa94b793cbe0133503741e142832c8f2ff1aa4b 2.9.13
diff -r cfc5540a4eec -r 61bf64b102bc contrib/init.d/README.txt
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/contrib/init.d/README.txt Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,15 @@
+SLES init script for Dive4Elements River:
+
+Installation as root:
+- Copy to /etc/init.d/d4e-river
+- chmod 755 /etc/init.d/d4e-river
+- insserv /etc/init.d/d4e-river
+- /etc/init.d/d4e-river start
+
+Uninstallation as root:
+- /etc/init.d/d4e-river stop
+- insserv -r /etc/init.d/d4e-river
+- rm /var/log/d4e-river.log /var/run/d4e-river.pid /etc/init.d/d4e-river
+
+TODO:
+- configure logrotate for the log file /var/log/d4e-river.log if needed
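The TODO above asks for logrotate handling of /var/log/d4e-river.log. A minimal sketch, written as a shell heredoc so it could be dropped into a provisioning script; the rotation policy (weekly, 8 copies) is an assumption, and CONF defaults to a temp file here instead of /etc/logrotate.d/d4e-river so the snippet runs without root:

```shell
# Install a minimal logrotate policy for the d4e-river log.
# CONF would normally be /etc/logrotate.d/d4e-river; a temp file is
# used here so the sketch runs without privileges.
CONF=${CONF:-$(mktemp)}
cat > "$CONF" << "EOF"
/var/log/d4e-river.log {
    weekly
    rotate 8
    compress
    missingok
    notifempty
    copytruncate
}
EOF
echo "wrote $CONF"
```

copytruncate is used because the init script redirects the Java process output straight into the log file and cannot be told to reopen it.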
diff -r cfc5540a4eec -r 61bf64b102bc contrib/init.d/d4e-river
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/contrib/init.d/d4e-river Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,70 @@
+#!/bin/bash
+#
+### BEGIN INIT INFO
+# Provides: d4e-server
+# Required-Start: $network $syslog $remote_fs
+# Should-Start: $named $syslog $time
+# Required-Stop: $network $syslog
+# Should-Stop: $named $syslog $time
+# Default-Start: 3 5
+# Default-Stop: 0 1 2 6
+# Short-Description: Dive4Elements server
+# Description: Start Dive4Elements server
+### END INIT INFO
+
+RUNAS=flys
+DIR="/opt/flys/current/server"
+
+CLASSPATH=
+for l in `find "$DIR/bin/lib" -name \*.jar -print`; do
+ CLASSPATH=$CLASSPATH:$l
+done
+
+
+LOGFILE=/var/log/d4e-river.log
+PIDFILE=/var/run/d4e-river.pid
+ARGS="-Xmx256m \
+ -server \
+ -Djava.awt.headless=true \
+ -Dflys.datacage.recommendations.development=false \
+ -Djava.io.tmpdir=\"$DIR/cache\" \
+ -Dflys.backend.enablejmx=true \
+ -Dflys.uesk.keep.artifactsdir=false \
+ -Dwsplgen.bin.path=\"$DIR/bin/wsplgen.exe\" \
+ -Dwsplgen.log.output=false \
+ -Dartifact.database.dir=\"$DIR/conf\""
+MAINCLASS=de.intevation.artifactdatabase.App
+
+# For SELinux we need to use 'runuser' not 'su'
+if [ -x "/sbin/runuser" ]; then
+ SU="/sbin/runuser"
+else
+ SU="/bin/su"
+fi
+
+case "$1" in
+ start)
+ echo "Starting D4E-river server..."
+ $SU - $RUNAS -c "/usr/bin/java -classpath $CLASSPATH $ARGS $MAINCLASS" &> $LOGFILE &
+ PID=$!
+ echo $PID > $PIDFILE
+ ;;
+ stop)
+ echo "Stopping D4E-river server..."
+ PID=`cat $PIDFILE`
+ STOPRES=0
+ while [ $STOPRES -le 0 ]
+ do
+ kill -15 $PID &> /dev/null
+ STOPRES=$?
+ sleep 1
+ done
+ echo "done."
+ ;;
+ restart)
+ $0 stop && $0 start
+ ;;
+ *)
+ echo "Usage: $0 [start|stop|restart]"
+esac
+
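The backtick `find` loop in the init script above splits on whitespace and errors when the lib directory is missing. A more defensive variant of the same CLASSPATH assembly (the /opt/flys layout is taken from the script; everything else is a sketch):

```shell
# Build CLASSPATH from all jars under $DIR/bin/lib without word splitting.
# A glob is used instead of `find` in backticks; if nothing matches, the
# literal pattern is skipped via the -e test.
DIR=${DIR:-/opt/flys/current/server}
CLASSPATH=
for jar in "$DIR"/bin/lib/*.jar; do
    [ -e "$jar" ] || continue
    CLASSPATH="$CLASSPATH:$jar"
done
CLASSPATH=${CLASSPATH#:}    # drop the leading colon
echo "CLASSPATH=$CLASSPATH"
```

Unlike the original, this also yields a CLASSPATH without a leading colon.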
diff -r cfc5540a4eec -r 61bf64b102bc contrib/make_flys_release/README
--- a/contrib/make_flys_release/README Wed Mar 06 14:14:15 2013 +0100
+++ b/contrib/make_flys_release/README Fri Mar 22 11:25:54 2013 +0100
@@ -1,34 +1,42 @@
Configuration:
==============
-The `confs` directory contains configuration files that have to be adjusted
-for every FLYS installation (ports, hosts, database connection, etc.).
+The make_release script can be configured via environment variables, or by
+editing the corresponding variables in the script itself.
-In `make_flys_release.sh`, the variable `RELEASE` selects a tag from the
-HG repository which is to be used for
-building FLYS.
+Important variables are:
+FLYS_SOURCE_DIR
+TOMCAT_PORT
+MAPSERVER_URL
+WIKI_URL
+LOG_DIR
+DEVELOPER
+DEFAULT_WD
-In addition, `make_flys_release` has to be configured to build FLYS for
-either an Oracle or a PostgreSQL database. The directory contains
-database-specific libraries in the `libs_oracle` and `libs_postgresql`
-subdirectories. Currently, lines 71-77 of `make_flys_release` determine
-which libraries (Oracle / PostgreSQL) are removed from the target
-directory again.
+# Seddb Configuration
+SEDDBURL
+SEDDBPORT
+SEDDBBACK
+SEDDBUSER
+SEDDBPASS
-TODOS:
-======
-- react to the return codes of the individual calls (mvn package, etc.)
-  and abort the build if necessary
-- integrate a config option for building against Oracle or PostgreSQL.
-- integrate the PostgreSQL / Oracle libs better into the build process
+# Backend configuration
+BACKENDURL
+BACKENDPORT
+BACKENDBACK
+BACKENDUSER
+BACKENDPASS
Process:
========
Once the configuration has been adjusted, the script can be started from
- sh make_release.sh
+ sh make_release.sh VERSION
the console. Afterwards the sources of
dive4elements, the HTTP client and FLYS are fetched from the HG repository
-over SSH and checked out. For this, the name of the HG user has to be
-adjusted in the second line of `make_flys_release.sh`. The sources are then built with Maven.
+over SSH and stored in FLYS_SOURCE_DIR.
+
+If the option -t is given to additionally tag this version, the account
+name used to push the tag has to be set as DEVELOPER in
+make_flys_release.sh.
For the client, OpenLayers-2.11 is downloaded and moved into the
client. Currently the complete OpenLayers-2.11 directory is moved into the client
@@ -38,3 +46,41 @@
`server` and `client`. The server part contains all configuration files and the
libraries required to start the FLYS server. The client part only contains the
WAR archive for a servlet container (e.g. Tomcat).
+
+Importer:
+=========
+The script that builds and packages the importer is located at
+bin/make-importer-package.sh
+It has to be adapted, and a few paths have to be set.
+
+To build a "standalone" package, this script can be passed a parameter
+that points to a tarball which is packed into the importer
+package as well. This tarball
+may contain dependencies (gdal / proj / oracle).
+The script that creates this tarball for SLES is
+bin/make-opt-package.sh
+
+Deployment:
+===========
+The tarball can be transferred to a target system and unpacked there.
+On the Intevation test systems, Flys installations are usually located
+under /opt/flys/flys-version
+
+Then deploy the flys-client into the webapps directory of Tomcat
+(e.g. /usr/share/tomcat6/webapps )
+and check / adjust the web.xml in WEB-INF if necessary.
+
+With an Apache vhost setup, a corresponding vhost has to be added to the
+Apache configuration.
+
+Afterwards make sure that matching WMS scripts are available in the
+mapserver. /srv/www/cgi-bin has to contain river-wms and user-wms
+files that point to the correct current version.
+The WMS URLs are configured in server/conf/floodmap.xml and server/conf/rivermap.xml.
+
+In server/conf/conf.xml, dgm-path has to be adjusted to point to the
+location of the DEMs (dgm) in the file system.
+Important: the path must end with a /
+
+Now the server can be started: change into the corresponding server
+directory and execute ./bin/run. The server must be started with this working directory.
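Further down in this patch, make_release.sh derives the artifact server port from the release version by prepending a 1 and stripping the dots. Shown standalone (the version value is just an example):

```shell
# Derive the artifact port the way make_release.sh does:
# prepend "1" and strip the dots, so 2.9.13 becomes 12913.
VERSION=2.9.13
ARTIFACT_PORT=$(echo "1$VERSION" | sed 's/\.//g')
echo "$ARTIFACT_PORT"    # 12913
```

This keeps ports unique per release as long as versions stay below four digits per component.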
diff -r cfc5540a4eec -r 61bf64b102bc contrib/make_flys_release/bin/make-importer-package.sh
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/contrib/make_flys_release/bin/make-importer-package.sh Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,59 @@
+#!/bin/bash
+set -e
+
+# See README for more information
+
+# The working directory. Resulting tarball will be placed in the directory above.
+PKG_DIR=/tmp/flys-importer
+# Default conf
+CONF_DIR=/path/to/conf/dir
+# Path to the flys checkout
+FLYS_DIR=/path/to/flys/root
+# Tarball that will be extracted into flys-importer/opt
+EXTRAS=$1
+
+rm -fr $PKG_DIR
+mkdir -p $PKG_DIR/hydr_morph
+mkdir -p $PKG_DIR/geodaesie
+mkdir -p $PKG_DIR/opt/lib64
+mkdir -p $PKG_DIR/schema
+mkdir -p $PKG_DIR/conf
+
+cat > "$PKG_DIR/conf/log4j.properties" << "EOF"
+log4j.rootLogger=DEBUG, IMPORTER
+log4j.appender.IMPORTER.layout=org.apache.log4j.PatternLayout
+log4j.appender.IMPORTER.layout.ConversionPattern=%d [%t] %-5p %c - %m%n
+log4j.appender.IMPORTER=org.apache.log4j.RollingFileAppender
+log4j.appender.IMPORTER.File=./import.log
+log4j.appender.IMPORTER.MaxFileSize=100000KB
+log4j.appender.IMPORTER.MaxBackupIndex=10
+EOF
+
+cd ${FLYS_DIR}/flys-backend
+mvn -f pom-oracle.xml clean compile assembly:single
+cp target/flys-backend-1.0-SNAPSHOT-jar-with-dependencies.jar \
+ $PKG_DIR/hydr_morph/importer.jar
+mvn -f pom.xml clean compile assembly:single
+cp target/flys-backend-1.0-SNAPSHOT-jar-with-dependencies.jar \
+ $PKG_DIR/hydr_morph/importer_psql.jar
+cp ${FLYS_DIR}/flys-backend/contrib/shpimporter/*.py $PKG_DIR/geodaesie
+cp ${FLYS_DIR}/flys-backend/contrib/run_geo.sh \
+ ${FLYS_DIR}/flys-backend/contrib/run_hydr_morph.sh \
+ ${FLYS_DIR}/flys-backend/contrib/import_river.sh \
+ $PKG_DIR
+cp ${FLYS_DIR}/flys-backend/doc/annotation-types.xml $PKG_DIR/conf
+if [ -f "$EXTRAS" ]; then
+ cd $PKG_DIR
+ tar -xzf "$EXTRAS"
+fi
+
+cp ${FLYS_DIR}/flys-backend/doc/schema/*.sql $PKG_DIR/schema
+cp ${FLYS_DIR}/flys-backend/doc/documentation/de/importer-manual.pdf $PKG_DIR
+
+sed -i 's/shpimporter\/shp/geodaesie\/shp/' $PKG_DIR/run_geo.sh
+
+cd $PKG_DIR/..
+DATE=$(date +%Y%m%d%H%M)
+tar -czf flys-importer_${DATE}.tar.gz flys-importer
+sha1sum flys-importer_${DATE}.tar.gz > flys-importer_${DATE}.tar.gz.sha1
+echo Package is at: `readlink -f flys-importer_${DATE}.tar.gz`
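make-importer-package.sh above writes a .sha1 file next to the tarball; the receiving side can check it with `sha1sum -c`. A self-contained illustration with placeholder file names:

```shell
# Create a demo tarball, record its SHA-1, then verify it the same way a
# consumer of the importer package would. All file names are placeholders.
TARBALL=flys-importer-demo.tar.gz
echo demo > payload.txt
tar -czf "$TARBALL" payload.txt
sha1sum "$TARBALL" > "$TARBALL.sha1"
sha1sum -c "$TARBALL.sha1"
```

`sha1sum -c` exits non-zero on mismatch, so the check can gate a deployment step directly.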
diff -r cfc5540a4eec -r 61bf64b102bc contrib/make_flys_release/bin/make-opt-package.sh
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/contrib/make_flys_release/bin/make-opt-package.sh Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,86 @@
+# Required packages are the build essential stuff make gcc etc.
+# and:
+# postgresql-devel libexpat-devel python-devel
+set -e
+# This script is intended to be run on SUSE Linux Enterprise (SLES)
+
+# Path to the oracle zip archives
+ORACLE_LOC=/home/intevation
+# Path to the Source tarballs of gdal-1.9.2.tar.gz proj-4.8.0.tar.gz cx_Oracle-5.1.2.tar.gz
+SOURCES=/home/intevation/Downloads
+#mkdir -p $SOURCES
+#cd $SOURCES
+#wget http://download.osgeo.org/gdal/gdal-1.9.2.tar.gz
+#wget http://download.osgeo.org/proj/proj-4.8.0.tar.gz
+#wget http://downloads.sourceforge.net/project/cx-oracle/5.1.2/cx_Oracle-5.1.2.tar.gz
+
+DEVELDIR=/tmp/gdalbuild
+INSTALL_PREFIX=$DEVELDIR/opt
+export ORACLE_HOME=$DEVELDIR/opt/instantclient_11_2
+export LD_LIBRARY_PATH=$ORACLE_HOME/lib:$LD_LIBRARY_PATH
+
+rm -rf $DEVELDIR
+mkdir -p $DEVELDIR
+
+mkdir -p $SOURCES
+cd $SOURCES
+wget http://download.osgeo.org/gdal/gdal-1.9.2.tar.gz
+wget http://download.osgeo.org/proj/proj-4.8.0.tar.gz
+wget http://downloads.sourceforge.net/project/cx-oracle/5.1.2/cx_Oracle-5.1.2.tar.gz
+
+
+# Oracle
+unzip $ORACLE_LOC/instantclient-basic-linux-x86-64-11.2.0.2.0.zip -d $DEVELDIR/opt
+unzip $ORACLE_LOC/instantclient-sdk-linux-x86-64-11.2.0.2.0.zip -d $DEVELDIR/opt
+unzip $ORACLE_LOC/instantclient-sqlplus-linux-x86-64-11.2.0.2.0.zip -d $DEVELDIR/opt
+mkdir $ORACLE_HOME/lib
+cd $ORACLE_HOME/lib
+ln -s ../libclntsh.so.11.1 .
+ln -s ../libclntsh.so.11.1 libclntsh.so
+ln -s ../libnnz11.so .
+ln -s ../libocci.so.11.1 .
+ln -s ../libocci.so.11.1 libocci.so
+ln -s ../libociei.so .
+ln -s ../libocijdbc11.so .
+ln -s ../libsqlplusic.so .
+ln -s ../libsqlplus.so .
+cd $ORACLE_HOME
+ln -s libclntsh.so.11.1 libclntsh.so
+
+cd $DEVELDIR
+tar -xf $SOURCES/proj-4.8.0.tar.gz
+cd proj-4.8.0
+./configure --prefix=$INSTALL_PREFIX && make && make install
+
+
+cd $DEVELDIR
+tar -xf $SOURCES/gdal-1.9.2.tar.gz
+cd gdal-1.9.2
+patch -l -p0 << "EOF"
+Index: ogr/ogrsf_frmts/oci/ogrocitablelayer.cpp
+===================================================================
+--- ogr/ogrsf_frmts/oci/ogrocitablelayer.cpp (revision 25700)
++++ ogr/ogrsf_frmts/oci/ogrocitablelayer.cpp (working copy)
+@@ -264,7 +264,7 @@
+ char **papszResult;
+ int iDim = -1;
+
+- oDimCmd.Append( "SELECT COUNT(*) FROM ALL_SDO_GEOM_METADATA u," );
++ oDimCmd.Append( "SELECT COUNT(*) FROM USER_SDO_GEOM_METADATA u," );
+ oDimCmd.Append( " TABLE(u.diminfo) t" );
+ oDimCmd.Append( " WHERE u.table_name = '" );
+ oDimCmd.Append( osTableName );
+EOF
+LDFLAGS="-Wl,--no-as-needed" ./configure --with-python --with-oci=$ORACLE_HOME \
+ --prefix=$INSTALL_PREFIX && make && make install
+
+cd $DEVELDIR
+tar -xf $SOURCES/cx_Oracle-5.1.2.tar.gz
+cd cx_Oracle-5.1.2
+python setup.py build
+python setup.py install --prefix=$INSTALL_PREFIX
+
+cd $DEVELDIR
+tar -czf flys-importer-opt.tar.gz opt
+echo "Package is:"
+readlink -f flys-importer-opt.tar.gz
diff -r cfc5540a4eec -r 61bf64b102bc contrib/make_flys_release/confs/artifact-db.xml
--- a/contrib/make_flys_release/confs/artifact-db.xml Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,6 +0,0 @@
-<?xml version="1.0" encoding="UTF-8" ?>
-<database>
- <user>SA</user>
- <password></password>
- <url>jdbc:h2:${artifacts.config.dir}/../artifactsdb/artifacts</url>
-</database>
diff -r cfc5540a4eec -r 61bf64b102bc contrib/make_flys_release/confs/backend-db.xml
--- a/contrib/make_flys_release/confs/backend-db.xml Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,18 +0,0 @@
-<?xml version="1.0" encoding="UTF-8" ?>
-<backend-database>
-
- <user>flys293</user>
- <password>flys293</password>
- <dialect>org.hibernate.dialect.PostgreSQLDialect</dialect>
- <driver>org.postgresql.Driver</driver>
- <url>jdbc:postgresql://czech-republic.atlas.intevation.de:5432/flys293</url>
-
- <!--
- <user>flys27</user>
- <password>flys27</password>
- <dialect>org.hibernatespatial.oracle.OracleSpatial10gDialect</dialect>
- <driver>oracle.jdbc.driver.OracleDriver</driver>
- <url>jdbc:oracle:thin:@//czech-republic.atlas.intevation.de:1521/XE</url>
- -->
-
-</backend-database>
diff -r cfc5540a4eec -r 61bf64b102bc contrib/make_flys_release/confs/datacage-db.xml
--- a/contrib/make_flys_release/confs/datacage-db.xml Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,6 +0,0 @@
-<?xml version="1.0" encoding="UTF-8" ?>
-<datacage>
- <user>SA</user>
- <password></password>
- <url>jdbc:h2:${artifacts.config.dir}/../datacagedb/datacage</url>
-</datacage>
diff -r cfc5540a4eec -r 61bf64b102bc contrib/make_flys_release/confs/floodmap.xml
--- a/contrib/make_flys_release/confs/floodmap.xml Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,30 +0,0 @@
-<?xml version="1.0" encoding="UTF-8" ?>
-<floodmap>
- <shapefile-path value="${artifacts.config.dir}/../shapefiles"/>
- <mapserver>
- <server path="http://czech-republic.intevation.de/cgi-bin/flys-default"/>
- <mapfile path="${artifacts.config.dir}/../flys.map"/>
- <templates path="${artifacts.config.dir}/mapserver/"/>
- <map-template path="mapfile.vm"/>
- </mapserver>
-
- <velocity>
- <logfile path="${artifacts.config.dir}/../velocity_log.log"/>
- </velocity>
-
- <river name="Saar">
- <srid value="31466"/>
- <river-wms url="http://czech-republic.intevation.de/cgi-bin/user-wms" layers="FLYS-Map"/>
- <background-wms url="http://osm.wheregroup.com/cgi-bin/osm_basic.xml?" layers="OSM_Basic"/>
- </river>
- <river name="Mosel">
- <srid value="31466"/>
- <river-wms url="http://czech-republic.intevation.de/cgi-bin/user-wms" layers="FLYS-Map"/>
- <background-wms url="http://osm.wheregroup.com/cgi-bin/osm_basic.xml?" layers="OSM_Basic"/>
- </river>
- <river name="Elbe">
- <srid value="31467"/>
- <river-wms url="http://czech-republic.intevation.de/cgi-bin/elbe-wms"/>
- <background-wms url="http://osm.wheregroup.com/cgi-bin/osm_basic.xml?" layers="OSM_Basic"/>
- </river>
-</floodmap>
diff -r cfc5540a4eec -r 61bf64b102bc contrib/make_flys_release/confs/mapserver/fontset.txt
--- a/contrib/make_flys_release/confs/mapserver/fontset.txt Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,2 +0,0 @@
-FreeSans /usr/share/splashy/themes/default/FreeSans.ttf
-DefaultFont /usr/share/splashy/themes/default/FreeSans.ttf
diff -r cfc5540a4eec -r 61bf64b102bc contrib/make_flys_release/confs/rest-server.xml
--- a/contrib/make_flys_release/confs/rest-server.xml Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,6 +0,0 @@
-<?xml version="1.0" encoding="UTF-8" ?>
-<rest-server>
- <!-- The port which the ArtifactDatabase (ArtifactServer) will bind to. -->
- <port>8999</port>
- <listen>localhost</listen>
-</rest-server>
diff -r cfc5540a4eec -r 61bf64b102bc contrib/make_flys_release/confs/rivermap.xml
--- a/contrib/make_flys_release/confs/rivermap.xml Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,30 +0,0 @@
-<?xml version="1.0" encoding="UTF-8" ?>
-<!--// configuration fragment for static river WMS //-->
-<rivermap>
- <mapserver>
- <server path="http://example.com/cgi-bin/"/>
- <mapfile path="${artifacts.config.dir}/../rivers.map"/>
- <templates path="${artifacts.config.dir}/mapserver/"/>
- <map-template path="river-mapfile.vm"/>
- </mapserver>
-
- <velocity>
- <logfile path="${artifacts.config.dir}/../rivermap_velocity.log"/>
- </velocity>
-
- <river name="Saar">
- <srid value="31467"/>
- <river-wms url="http://example.com/cgi-bin/river-wms" layers="Saar"/>
- <background-wms url="http://osm.intevation.de/mapcache/?" layers="flys-wms"/>
- </river>
- <river name="Mosel">
- <srid value="31467"/>
- <river-wms url="http://example.com/cgi-bin/river-wms" layers="Mosel"/>
- <background-wms url="http://osm.intevation.de/mapcache/?" layers="flys-wms"/>
- </river>
- <river name="Elbe">
- <srid value="31467"/>
- <river-wms url="http://example.com/cgi-bin/river-wms" layers="Elbe"/>
- <background-wms url="http://osm.intevation.de/mapcache/?" layers="flys-wms"/>
- </river>
-</rivermap>
diff -r cfc5540a4eec -r 61bf64b102bc contrib/make_flys_release/confs/seddb-db.xml
--- a/contrib/make_flys_release/confs/seddb-db.xml Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,9 +0,0 @@
-<?xml version="1.0" encoding="UTF-8" ?>
-<seddb-database>
- <!-- This is the default SedDB db configuration. -->
- <user>seddb</user>
- <password>seddbpass</password>
- <dialect>org.hibernate.dialect.Oracle9iDialect</dialect>
- <driver>oracle.jdbc.driver.OracleDriver</driver>
- <url>jdbc:oracle:thin:@//czech-republic.atlas.intevation.de:1521/XE</url>
-</seddb-database>
diff -r cfc5540a4eec -r 61bf64b102bc contrib/make_flys_release/h2/artifacts-h2.sql
--- a/contrib/make_flys_release/h2/artifacts-h2.sql Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,59 +0,0 @@
---
--- schema to store artifacts in H2 databases.
---
-
-BEGIN;
-
--- not using AUTO_INCREMENT to be more compatible with
--- other dbms.
-CREATE SEQUENCE ARTIFACTS_ID_SEQ;
-
-CREATE TABLE artifacts (
- id INT PRIMARY KEY NOT NULL,
- gid UUID NOT NULL UNIQUE,
- creation TIMESTAMP NOT NULL,
- last_access TIMESTAMP NOT NULL,
- ttl BIGINT, -- NULL means eternal
- factory VARCHAR(256) NOT NULL,
- data BINARY
-);
-
-CREATE SEQUENCE USERS_ID_SEQ;
-
-CREATE TABLE users (
- id INT PRIMARY KEY NOT NULL,
- gid UUID NOT NULL UNIQUE,
- name VARCHAR(256) NOT NULL,
- account VARCHAR(256) NOT NULL UNIQUE,
- role BINARY
-);
-
-CREATE SEQUENCE COLLECTIONS_ID_SEQ;
-
-CREATE TABLE collections (
- id INT PRIMARY KEY NOT NULL,
- gid UUID NOT NULL UNIQUE,
- name VARCHAR(256) NOT NULL,
- owner_id INT NOT NULL REFERENCES users(id),
- creation TIMESTAMP NOT NULL,
- last_access TIMESTAMP NOT NULL,
- ttl BIGINT, -- NULL means eternal
- attribute BINARY
-);
-
-CREATE SEQUENCE COLLECTION_ITEMS_ID_SEQ;
-
-CREATE TABLE collection_items (
- id INT PRIMARY KEY NOT NULL,
- collection_id INT NOT NULL REFERENCES collections(id),
- artifact_id INT NOT NULL REFERENCES artifacts(id),
- attribute BINARY,
- creation TIMESTAMP NOT NULL,
- UNIQUE (collection_id, artifact_id)
-);
-
-CREATE TRIGGER collections_access_update_trigger AFTER UPDATE
- ON artifacts FOR EACH ROW
- CALL "de.intevation.artifactdatabase.h2.CollectionAccessUpdateTrigger";
-
-COMMIT;
diff -r cfc5540a4eec -r 61bf64b102bc contrib/make_flys_release/h2/createArtifacts.sh
--- a/contrib/make_flys_release/h2/createArtifacts.sh Wed Mar 06 14:14:15 2013 +0100
+++ b/contrib/make_flys_release/h2/createArtifacts.sh Fri Mar 22 11:25:54 2013 +0100
@@ -1,6 +1,6 @@
#!/bin/bash
-mkdir artifactsdb
+mkdir -p artifactsdb
DIR=`dirname $0`
DIR=`readlink -f "$DIR"`
@@ -12,6 +12,10 @@
export CLASSPATH
+if [ $# != 1 ]; then
+  echo "Usage: $0 <schema_file>" >&2; exit 1
+fi
+
java org.h2.tools.RunScript \
-url jdbc:h2:`readlink -f artifactsdb`/artifacts \
- -script $DIR/artifacts-h2.sql
+ -script "$1"
diff -r cfc5540a4eec -r 61bf64b102bc contrib/make_flys_release/h2/createDatacage.sh
--- a/contrib/make_flys_release/h2/createDatacage.sh Wed Mar 06 14:14:15 2013 +0100
+++ b/contrib/make_flys_release/h2/createDatacage.sh Fri Mar 22 11:25:54 2013 +0100
@@ -1,6 +1,6 @@
#!/bin/bash
-mkdir datacagedb
+mkdir -p datacagedb
DIR=`dirname $0`
DIR=`readlink -f "$DIR"`
@@ -11,7 +11,10 @@
done
export CLASSPATH
+if [ $# != 1 ]; then
+  echo "Usage: $0 <schema_file>" >&2; exit 1
+fi
java org.h2.tools.RunScript \
-url jdbc:h2:`readlink -f datacagedb`/datacage \
- -script $DIR/datacage.sql
+ -script "$1"
diff -r cfc5540a4eec -r 61bf64b102bc contrib/make_flys_release/h2/datacage.sql
--- a/contrib/make_flys_release/h2/datacage.sql Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,104 +0,0 @@
-BEGIN;
-
-CREATE SEQUENCE USERS_ID_SEQ;
-
-CREATE TABLE users (
- id INT PRIMARY KEY NOT NULL,
- gid UUID NOT NULL UNIQUE
-);
-
-CREATE SEQUENCE COLLECTIONS_ID_SEQ;
-
-CREATE TABLE collections (
- id INT PRIMARY KEY NOT NULL,
- gid UUID NOT NULL UNIQUE,
- user_id INT NOT NULL REFERENCES users(id) ON DELETE CASCADE,
- name VARCHAR(256) NOT NULL,
- creation TIMESTAMP NOT NULL
-);
-
-CREATE SEQUENCE ARTIFACTS_ID_SEQ;
-
-CREATE TABLE artifacts (
- id INT PRIMARY KEY NOT NULL,
- gid UUID NOT NULL UNIQUE,
- state VARCHAR(256) NOT NULL,
- creation TIMESTAMP NOT NULL
-);
-
-CREATE SEQUENCE COLLECTION_ITEMS_ID_SEQ;
-
-CREATE TABLE collection_items (
- id INT PRIMARY KEY NOT NULL,
- collection_id INT NOT NULL REFERENCES collections(id) ON DELETE CASCADE,
- artifact_id INT NOT NULL REFERENCES artifacts(id) ON DELETE CASCADE
-);
-
-CREATE SEQUENCE ARTIFACT_DATA_ID_SEQ;
-
-CREATE TABLE artifact_data (
- id INT PRIMARY KEY NOT NULL,
- artifact_id INT NOT NULL REFERENCES artifacts(id) ON DELETE CASCADE,
- kind VARCHAR(256) NOT NULL,
- k VARCHAR(256) NOT NULL,
- v VARCHAR(256), -- Maybe too short
- UNIQUE (artifact_id, k)
-);
-
-CREATE SEQUENCE OUTS_ID_SEQ;
-
-CREATE TABLE outs (
- id INT PRIMARY KEY NOT NULL,
- artifact_id INT NOT NULL REFERENCES artifacts(id) ON DELETE CASCADE,
- name VARCHAR(256) NOT NULL,
- description VARCHAR(256),
- out_type VARCHAR(256)
-);
-
-CREATE SEQUENCE FACETS_ID_SEQ;
-
-CREATE TABLE facets (
- id INT PRIMARY KEY NOT NULL,
- out_id INT NOT NULL REFERENCES outs(id) ON DELETE CASCADE,
- name VARCHAR(256) NOT NULL,
- num INT NOT NULL,
- state VARCHAR(256) NOT NULL,
- description VARCHAR(256),
- UNIQUE (out_id, num, name)
-);
-
-CREATE VIEW master_artifacts AS
- SELECT a2.id AS id,
- a2.gid AS gid,
- a2.state AS state,
- a2.creation AS creation,
- ci2.collection_id AS collection_id
- FROM collection_items ci2
- JOIN artifacts a2
- ON ci2.artifact_id = a2.id
- JOIN (SELECT ci.collection_id AS c_id,
- MIN(a.creation) AS oldest_a
- FROM collection_items ci
- JOIN artifacts a
- ON ci.artifact_id = a.id
- GROUP BY ci.collection_id) o
- ON o.c_id = ci2.collection_id
- WHERE a2.creation = o.oldest_a;
-
--- DROP VIEW master_artifacts;
--- DROP SEQUENCE USERS_ID_SEQ;
--- DROP SEQUENCE COLLECTIONS_ID_SEQ;
--- DROP SEQUENCE ARTIFACTS_ID_SEQ;
--- DROP SEQUENCE COLLECTION_ITEMS_ID_SEQ;
--- DROP SEQUENCE ARTIFACT_DATA_ID_SEQ;
--- DROP SEQUENCE OUTS_ID_SEQ;
--- DROP SEQUENCE FACETS_ID_SEQ;
--- DROP TABLE facets;
--- DROP TABLE outs;
--- DROP TABLE artifact_data;
--- DROP TABLE collection_items;
--- DROP TABLE collections;
--- DROP TABLE artifacts;
--- DROP TABLE users;
-
-COMMIT;
diff -r cfc5540a4eec -r 61bf64b102bc contrib/make_flys_release/libs_oracle/ojdbc5.jar
Binary file contrib/make_flys_release/libs_oracle/ojdbc5.jar has changed
diff -r cfc5540a4eec -r 61bf64b102bc contrib/make_flys_release/make_release.sh
--- a/contrib/make_flys_release/make_release.sh Wed Mar 06 14:14:15 2013 +0100
+++ b/contrib/make_flys_release/make_release.sh Fri Mar 22 11:25:54 2013 +0100
@@ -1,143 +1,336 @@
#!/bin/bash
+# Release script for Flys
+#
+# Authors:
+# Andre Heinecke <aheinecke@intevation.de>
+#
+# Copyright:
+# Copyright (C) 2013 Intevation GmbH
+#
+# This program is free software; you can redistribute it and/or
+# modify it under the terms of the GNU General Public License
+# as published by the Free Software Foundation; either version 2
+# of the License, or (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program; if not, write to the Free Software
+# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
-echo "INFO: define required variables"
+set -e
+DEFAULT_WD=/tmp/flys-release
+DEVELOPER=aheinecke
+
ARTIFACTS_HG_REPO="http://wald.intevation.org/hg/dive4elements/artifacts"
HTTPCLIIENT_HG_REPO="http://wald.intevation.org/hg/dive4elements/http-client"
FLYS_HG_REPO="http://wald.intevation.org/hg/dive4elements/flys"
-ARTIFACTS_HG="hg.artifacts"
-HTTPCLIENT_HG="hg.http-client"
-FLYS_HG="hg.flys"
+REPOS="artifacts http-client flys"
+# Do not use spaces in path
+FLYS_SOURCE_DIR=/local-vol1/aheinecke/flys-release
-PREFIX="flys-"
-RELEASE=${RELEASE:-default}
-RELEASE_DATE=`date +'%Y-%m-%d'`
-DIRECTORY=$PREFIX$RELEASE-$RELEASE_DATE
+ORACLE_HIBERNATE=$FLYS_SOURCE_DIR/hibernate-spatial-oracle-1.1.jar
+ORACLE_JDBC=$FLYS_SOURCE_DIR/ojdbc5.jar
-ARTIFACT_PORT=${ARTIFACT_PORT:-9002}
-TOMCAT_PORT=${TOMCAT_PORT:-8005}
+SCRIPT_DIR=$(readlink -f `dirname $0`)
+usage(){
+ cat << EOF
-MAPSERVER_URL=${MAPSERVER_URL:-czech-republic.atlas.intevation.de}
+usage: $0 [options] VERSION
-echo "INFO: create server directories"
-mkdir -p $DIRECTORY/server/bin/lib/own
-mkdir $DIRECTORY/server/shapefiles
-mkdir $DIRECTORY/client
+Create a flys package
-echo "INFO: checkout sources"
-echo " ... checkout $ARTIFACTS_HG_REPO"
+OPTIONS:
+ -?, --help Show this message
+ -w The working directory to use. (do not use spaces in path)
+ Default: $DEFAULT_WD
+ -t Tag the current default branch as "VERSION"
+ -o, --oracle Release is for oracle.
+    VERSION must be in the format MAJOR.MINOR.PATCH
+EOF
+exit 0
+}
+# --backend-db-url Url of database backend. Default: $BACKENDURL
+# --backend-db-pass Backend db password. Default: $BACKENDPASS
+# --backend-db-port Backend db port. Default: $BACKENDPORT
+# --backend-db-user Backend db user. Default: $BACKENDUSER
+# --backend-db-backend Backend db backend name. Default: $BACKENDBACK
+# --seddb-url Sediment db url. Default: $SEDDBURL
+# --seddb-port Sediment db port. Default: $SEDDBPORT
+# --seddb-user Sediment db user. Default: $SEDDBUSER
+# --seddb-pass Sediment db password. Default: $SEDDBPASS
+# --seddb-back Sediment db backend. Default: $SEDDBBACK
+TOMCAT_PORT=${TOMCAT_PORT:-8282}
+MAPSERVER_URL=${MAPSERVER_URL:-flys-devel.intevation.de}
+WIKI_URL=${WIKI_URL:-https://flys-intern.intevation.de/Flys-3.0}
-rm -rf $ARTIFACTS_HG
-hg clone $ARTIFACTS_HG_REPO $ARTIFACTS_HG
-(cd $ARTIFACTS_HG && hg co $RELEASE)
+# Seddb Configuration
+SEDDBURL=${SEDDBURL:-czech-republic.atlas.intevation.de}
+SEDDBPORT=${SEDDBPORT:-1521}
+SEDDBBACK=${SEDDBBACK:-XE}
+SEDDBUSER=${SEDDBUSER:-seddb}
+SEDDBPASS=${SEDDBPASS:-seddbpass}
-echo " ... checkout $HTTPCLIIENT_HG_REPO"
-rm -rf $HTTPCLIENT_HG
-hg clone $HTTPCLIIENT_HG_REPO $HTTPCLIENT_HG
-(cd $HTTPCLIENT_HG && hg co $RELEASE)
+# Backend configuration
+BACKENDURL=${BACKENDURL:-czech-republic.atlas.intevation.de}
+BACKENDPORT=${BACKENDPORT:-5432}
+BACKENDBACK=${BACKENDBACK:-flys_2913}
+BACKENDUSER=${BACKENDUSER:-flys_dami}
+BACKENDPASS=${BACKENDPASS:-flys_dami}
+INITSQLS=${INITSQLS:-}
+LOG_DIR=/var/log/flys
-echo " ... checkout $FLYS_HG_REPO"
-rm -rf $FLYS_HG
-hg clone $FLYS_HG_REPO $FLYS_HG
-(cd $FLYS_HG && hg co $RELEASE)
+OPTS=`getopt -o ?w:,t,o \
+ -l help,oracle \
+ -n $0 -- "$@"`
-# adapt client configuration
-echo "INFO: prepare configuration of web client"
+if [ $? != 0 ] ; then usage; fi
+eval set -- "$OPTS"
+while true ; do
+ case "$1" in
+ "-?"|"--help")
+ usage;;
+ "--")
+ shift
+ break;;
+ "-w")
+ WORK_DIR=$2
+ shift 2;;
+ "-o"|"--oracle")
+ BUILD_ORACLE="TRUE"
+ shift;;
+ "-t")
+ DO_TAG="TRUE"
+ shift;;
+ *)
+ echo "Unknown Option $1"
+ usage;;
+ esac
+done
+if [ $# != 1 ]; then
+ usage
+fi
+
+VERSION=$1
+ARTIFACT_PORT=${ARTIFACT_PORT:-`echo 1$VERSION | sed 's/\.//g'`}
+
+if [ -z $WORK_DIR ]; then
+ WORK_DIR=$DEFAULT_WD
+fi
+
+mkdir -p $WORK_DIR
+
+if [ ! -d $FLYS_SOURCE_DIR ]; then
+ mkdir -p $FLYS_SOURCE_DIR
+ echo "Cloning sources"
+ cd $FLYS_SOURCE_DIR
+ hg clone $ARTIFACTS_HG_REPO artifacts
+ hg clone $HTTPCLIIENT_HG_REPO http-client
+ hg clone $FLYS_HG_REPO flys
+else
+ echo "Updating sources / Reverting changes"
+ cd $FLYS_SOURCE_DIR
+ for repo in $REPOS; do
+ cd $repo && hg pull && hg up && hg revert -a && cd $FLYS_SOURCE_DIR
+ done;
+fi
+
+if [ "$DO_TAG" = "TRUE" ]; then
+ echo "Tagging version $VERSION"
+ for repo in $REPOS; do
+ cd $repo
+ CHANGESET=$(hg log -l1 |head -1 | awk -F: '{print $3}')
+ echo ""
+ echo "Do you really want to tag $repo rev: $CHANGESET as Version $VERSION?"
+ echo "press enter to continue or CTRL+C to abort."
+ echo ""
+ hg log -l1
+ read
+ OLD_REV=$(cat .hgtags | tail -1 | awk '{print $2}')
+ hg tag $VERSION -m "Added tag $VERSION for changeset $CHANGESET"
+ hg push ssh://$DEVELOPER@scm.wald.intevation.org/hg/dive4elements/$repo
+ echo "Changelog for $repo" >> $WORK_DIR/changes_$OLD_REV-$VERSION.txt
+ echo "#############################################################################" \
+ >> $WORK_DIR/changes_$OLD_REV-$VERSION.txt
+ hg log -r $VERSION:$OLD_REV --style changelog >> $WORK_DIR/changes_$OLD_REV-$VERSION.txt
+ cd $FLYS_SOURCE_DIR
+ done;
+fi
+
+# Update to current version
+for repo in $REPOS; do
+ cd $repo
+ hg up $VERSION
+ cd $FLYS_SOURCE_DIR
+done
+
+rm -rf "$WORK_DIR/server" "$WORK_DIR/client"
+cd $WORK_DIR
+mkdir -p "$WORK_DIR/server/bin/lib/own"
+mkdir "$WORK_DIR/server/shapefiles"
+mkdir "$WORK_DIR/client"
+
+echo "[INFO]: Preparing configuration of web client"
+echo "[INFO]: Tomcat Port: $TOMCAT_PORT"
+echo "[INFO]: Artifact Port: $ARTIFACT_PORT"
sed -i -e "s@http://localhost:8181@http://localhost:$ARTIFACT_PORT@g" \
    -e "s@http://localhost:8888@http://localhost:$TOMCAT_PORT@g" \
- $FLYS_HG/flys-client/src/main/webapp/WEB-INF/web.xml
+ $FLYS_SOURCE_DIR/flys/flys-client/src/main/webapp/WEB-INF/web.xml
-sed -i -e "s@/tmp/flys-client.log@/tmp/flys-client-${RELEASE}.log at g" \
- $FLYS_HG/flys-client/src/main/webapp/WEB-INF/log4j.properties
+sed -i -e "s@/tmp/flys-client.log@${LOG_DIR}/client-${VERSION}.log@g" \
+ $FLYS_SOURCE_DIR/flys/flys-client/src/main/webapp/WEB-INF/log4j.properties
-echo "INFO: download OpenLayers-2.11 for client"
-curl -O http://openlayers.org/download/OpenLayers-2.11.tar.gz
-tar xvfz OpenLayers-2.11.tar.gz
-# TODO: Remove more superfluous OpenLayers stuff.
-rm -rf OpenLayers-2.11/doc
-rm -rf OpenLayers-2.11/tests
-rm -rf OpenLayers-2.11/examples
-mv OpenLayers-2.11 $FLYS_HG/flys-client/src/main/webapp/
+find $FLYS_SOURCE_DIR/flys/flys-artifacts/src/main/resources/ -name messages\*.properties | \
+ xargs sed -i "s@https://flys-intern.intevation.de/Flys-3.0@${WIKI_URL}@g";
-# compile and build our code stuff
+find $FLYS_SOURCE_DIR/flys/ -name \*.properties -o -name \*.xsl | \
+ xargs sed -i "s@https://flys-intern.intevation.de@${WIKI_URL}@g";
+
+if [ ! -f $FLYS_SOURCE_DIR/OpenLayers-2.11.tar.gz ]; then
+ echo "INFO: download OpenLayers-2.11 for client"
+ cd $FLYS_SOURCE_DIR
+ curl -O http://openlayers.org/download/OpenLayers-2.11.tar.gz
+ tar xvfz OpenLayers-2.11.tar.gz
+ # TODO: Remove more superfluous OpenLayers stuff.
+ rm -rf OpenLayers-2.11/doc
+ rm -rf OpenLayers-2.11/tests
+ rm -rf OpenLayers-2.11/examples
+ cd $WORK_DIR
+fi
+cp -r $FLYS_SOURCE_DIR/OpenLayers-2.11 $FLYS_SOURCE_DIR/flys/flys-client/src/main/webapp/
+
echo "INFO: compile and build sources"
-mvn -f $ARTIFACTS_HG/pom.xml clean compile package install
-mvn -f $FLYS_HG/flys-backend/pom.xml clean compile package install
-mvn -f $FLYS_HG/flys-artifacts/pom.xml clean compile package dependency:copy-dependencies install
-mvn -f $HTTPCLIENT_HG/pom.xml clean compile package install
-mvn -f $FLYS_HG/flys-client/pom.xml clean compile package
+mvn -f $FLYS_SOURCE_DIR/artifacts/pom.xml clean compile package install
+mvn -f $FLYS_SOURCE_DIR/flys/flys-backend/pom.xml clean compile package install
+mvn -f $FLYS_SOURCE_DIR/flys/flys-artifacts/pom.xml clean compile package dependency:copy-dependencies install
+mvn -f $FLYS_SOURCE_DIR/http-client/pom.xml clean compile package install
+mvn -f $FLYS_SOURCE_DIR/flys/flys-client/pom.xml clean compile package
-## fetch the java stuff
+
echo "INFO: copy dependencies and libs"
-cp $ARTIFACTS_HG/artifact-database/target/artifact-database-1.0-SNAPSHOT.jar $DIRECTORY/server/bin/lib/own/
-cp $ARTIFACTS_HG/artifacts/target/artifacts-1.0-SNAPSHOT.jar $DIRECTORY/server/bin/lib/own/
-cp $ARTIFACTS_HG/artifacts-common/target/artifacts-common-1.0-SNAPSHOT.jar $DIRECTORY/server/bin/lib/own/
-cp $FLYS_HG/flys-backend/target/flys-backend-1.0-SNAPSHOT.jar $DIRECTORY/server/bin/lib/own/
-cp $FLYS_HG/flys-artifacts/target/flys-artifacts-1.0-SNAPSHOT.jar $DIRECTORY/server/bin/lib/own/
-cp $FLYS_HG/flys-client/target/FLYS-1.0-SNAPSHOT.war $DIRECTORY/client/flys-${RELEASE}.war
-cp $FLYS_HG/flys-artifacts/target/dependency/* $DIRECTORY/server/bin/lib/
+cp $FLYS_SOURCE_DIR/artifacts/artifact-database/target/artifact-database-1.0-SNAPSHOT.jar $WORK_DIR/server/bin/lib/own/
+cp $FLYS_SOURCE_DIR/artifacts/artifacts/target/artifacts-1.0-SNAPSHOT.jar $WORK_DIR/server/bin/lib/own/
+cp $FLYS_SOURCE_DIR/artifacts/artifacts-common/target/artifacts-common-1.0-SNAPSHOT.jar $WORK_DIR/server/bin/lib/own/
+cp $FLYS_SOURCE_DIR/flys/flys-backend/target/flys-backend-1.0-SNAPSHOT.jar $WORK_DIR/server/bin/lib/own/
+cp $FLYS_SOURCE_DIR/flys/flys-artifacts/target/flys-artifacts-1.0-SNAPSHOT.jar $WORK_DIR/server/bin/lib/own/
+cp $FLYS_SOURCE_DIR/flys/flys-client/target/FLYS-1.0-SNAPSHOT.war $WORK_DIR/client/flys-${VERSION}.war
+cp $FLYS_SOURCE_DIR/flys/flys-artifacts/target/dependency/* $WORK_DIR/server/bin/lib/
echo "INFO: copy scripts and libraries to target destination"
-cp bin/run.sh $DIRECTORY/server/bin/
-cp bin/wsplgen.exe $DIRECTORY/server/bin/
-cp libs/* $DIRECTORY/server/bin/lib/
+cp ${SCRIPT_DIR}/bin/run.sh $WORK_DIR/server/bin/
+cp ${SCRIPT_DIR}/bin/wsplgen.exe $WORK_DIR/server/bin/
+cp ${SCRIPT_DIR}/libs/* $WORK_DIR/server/bin/lib/
-#echo "INFO: remove PostgreSQL and PostGIS libraries"
-#rm $DIRECTORY/server/bin/lib/postg*
-#rm $DIRECTORY/server/bin/lib/hibernate-spatial-postgis*
-echo "INFO: remove Oralce libraries"
-rm -f $DIRECTORY/server/bin/lib/hibernate-spatial-oracle-1.1.jar
-rm -f $DIRECTORY/server/bin/lib/ojdbc*
+if [ "$BUILD_ORACLE" = "TRUE" ]; then
+ echo "INFO: remove PostgreSQL and PostGIS libraries"
+ rm $WORK_DIR/server/bin/lib/postg*
+ rm $WORK_DIR/server/bin/lib/hibernate-spatial-postgis*
+ if [ ! -f $ORACLE_JDBC ]; then
+ echo "Could not find Oracle JDBC driver at $ORACLE_JDBC"
+ echo "Please make sure that the ORACLE_JDBC variable is set correctly"
+ exit 1
+ fi
+ if [ ! -f $ORACLE_HIBERNATE ]; then
+ echo "Could not find hibernate-spatial-oracle in $ORACLE_HIBERNATE"
+ echo "Please make sure that the ORACLE_HIBERNATE variable is set correctly"
+ exit 1
+ fi
+ cp $ORACLE_HIBERNATE $ORACLE_JDBC $WORK_DIR/server/bin/lib/
+else
+ echo "INFO: remove Oracle libraries"
+ rm -f $WORK_DIR/server/bin/lib/hibernate-spatial-oracle-1.1.jar
+ rm -f $WORK_DIR/server/bin/lib/ojdbc*
+fi
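[Editor's note, not part of the patch: the existence checks for $ORACLE_JDBC and $ORACLE_HIBERNATE above can be factored into one small guard. A minimal sketch; the function name `require_file` is ours, not from the script.]

```shell
# Sketch only: fail early when a required library file is missing,
# mirroring the ORACLE_JDBC / ORACLE_HIBERNATE checks above.
require_file() {
    if [ ! -f "$1" ]; then
        echo "Could not find required file: $1" >&2
        echo "Please make sure the corresponding variable is set correctly" >&2
        return 1
    fi
}

# Hypothetical usage:  require_file "$ORACLE_JDBC" || exit 1
```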
# fetch the configuration stuff
echo "INFO: copy default configuration to target destination"
-cp -R $FLYS_HG/flys-artifacts/doc/conf $DIRECTORY/server/
+cp -R $FLYS_SOURCE_DIR/flys/flys-artifacts/doc/conf $WORK_DIR/server/
-#cp confs/* $DIRECTORY/server/conf/
-mkdir -p $DIRECTORY/server/conf
+sed -i "s/8181/$ARTIFACT_PORT/g" \
+ $WORK_DIR/server/conf/rest-server.xml
-sed "s/8999/$ARTIFACT_PORT/g" \
- confs/rest-server.xml \
- > $DIRECTORY/server/conf/rest-server.xml
+sed -i -e "s at http://example.com/@http://${MAPSERVER_URL}/@g" \
+ $WORK_DIR/server/conf/floodmap.xml
-sed -e "s at http://example.com/@http://${MAPSERVER_URL}/@g" \
- confs/floodmap.xml \
- > $DIRECTORY/server/conf/floodmap.xml
+sed -i -e "s at http://example.com/@http://${MAPSERVER_URL}/@g" \
+ $WORK_DIR/server/conf/rivermap.xml
-sed -e "s at http://example.com/@http://${MAPSERVER_URL}/@g" \
- confs/rivermap.xml \
- > $DIRECTORY/server/conf/rivermap.xml
+sed -i -e "s@/tmp/flys-rivers-wms.log@${LOG_DIR}/rivers-wms-${VERSION}.log@g" \
+ $WORK_DIR/server/conf/mapserver/river-mapfile.vm
+sed -i -e "s@/tmp/flys-user-wms.log@${LOG_DIR}/user-wms-${VERSION}.log@g" \
+ $WORK_DIR/server/conf/mapserver/mapfile.vm
-sed "s@/tmp/flys-server-default.log@/tmp/flys-server-${RELEASE}.log" \
- confs/log4j.properties \
- > $DIRECTORY/server/conf/log4j.properties
+sed "s@/tmp/flys-server-default.log@${LOG_DIR}/server-${VERSION}.log@" \
+ $SCRIPT_DIR/confs/log4j.properties \
+ > $WORK_DIR/server/conf/log4j.properties
-# TODO: Use templating here
-cp confs/seddb-db.xml $DIRECTORY/server/conf/seddb-db.xml
-cp confs/backend-db.xml $DIRECTORY/server/conf/backend-db.xml
-cp confs/artifact-db.xml $DIRECTORY/server/conf/artifact-db.xml
-cp confs/datacage-db.xml $DIRECTORY/server/conf/datacage-db.xml
-cp confs/mapserver/fontset.txt $DIRECTORY/server/conf/mapserver/fontset.txt
+cat > $WORK_DIR/server/conf/seddb-db.xml << EOF
+<?xml version="1.0" encoding="UTF-8" ?>
+<seddb-database>
+ <!-- This is the default SedDB db configuration. -->
+ <user>$SEDDBUSER</user>
+ <password>$SEDDBPASS</password>
+ <dialect>org.hibernate.dialect.Oracle9iDialect</dialect>
+ <driver>oracle.jdbc.driver.OracleDriver</driver>
+ <url>jdbc:oracle:thin:@//$SEDDBURL:$SEDDBPORT/$SEDDBBACK</url>
+</seddb-database>
+EOF
-cp $ARTIFACTS_HG/artifact-database/doc/schema-h2.sql h2/artifacts-h2.sql
-cp $FLYS_HG/flys-artifacts/doc/conf/datacage.sql h2/datacage.sql
+if [ "$BUILD_ORACLE" = "TRUE" ]; then
+ # Oracle backend configuration
+ cat > $WORK_DIR/server/conf/backend-db.xml << EOF
+<?xml version="1.0" encoding="UTF-8" ?>
+<backend-database>
+ <user>$BACKENDUSER</user>
+ <password>$BACKENDPASS</password>
+ <dialect>org.hibernatespatial.oracle.OracleSpatial10gDialect</dialect>
+ <driver>oracle.jdbc.driver.OracleDriver</driver>
+ <url>jdbc:oracle:thin:@//$BACKENDURL:$BACKENDPORT/$BACKENDBACK</url>
+ <connection-init-sqls>$INITSQLS</connection-init-sqls>
+</backend-database>
+EOF
+else
+ # PostgreSQL backend configuration
+ cat > $WORK_DIR/server/conf/backend-db.xml << EOF
+<?xml version="1.0" encoding="UTF-8" ?>
+<backend-database>
+ <user>$BACKENDUSER</user>
+ <password>$BACKENDPASS</password>
+ <dialect>org.hibernate.dialect.PostgreSQLDialect</dialect>
+ <driver>org.postgresql.Driver</driver>
+ <url>jdbc:postgresql://$BACKENDURL:$BACKENDPORT/$BACKENDBACK</url>
+ <connection-init-sqls>$INITSQLS</connection-init-sqls>
+</backend-database>
+EOF
+fi
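[Editor's note, not part of the patch: the config files above are generated by plain shell variable expansion inside an unquoted heredoc. A minimal, self-contained sketch with placeholder credentials; only the two shown elements are reproduced.]

```shell
# Sketch only: generate a minimal backend config via heredoc expansion.
# BACKENDUSER / BACKENDPASS values are placeholders for illustration.
BACKENDUSER=flys
BACKENDPASS=secret

write_backend_config() {
    # Unquoted EOF: $BACKENDUSER and $BACKENDPASS are expanded on write.
    cat > "$1" << EOF
<?xml version="1.0" encoding="UTF-8" ?>
<backend-database>
  <user>$BACKENDUSER</user>
  <password>$BACKENDPASS</password>
</backend-database>
EOF
}

write_backend_config /tmp/backend-db.xml
```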
+
+mkdir $WORK_DIR/artifactsdb
+mkdir $WORK_DIR/datacagedb
+
+cp $FLYS_SOURCE_DIR/artifacts/artifact-database/doc/schema-h2.sql $WORK_DIR/artifactsdb/artifacts-h2.sql
+cp $FLYS_SOURCE_DIR/flys/flys-artifacts/doc/conf/datacage.sql $WORK_DIR/datacagedb/datacage.sql
echo "INFO: create h2 database for artifacts and datacage"
-h2/createArtifacts.sh
-h2/createDatacage.sh
+$SCRIPT_DIR/h2/createArtifacts.sh $WORK_DIR/artifactsdb/artifacts-h2.sql
+$SCRIPT_DIR/h2/createDatacage.sh $WORK_DIR/datacagedb/datacage.sql
-mv artifactsdb $DIRECTORY/server/
-mv datacagedb $DIRECTORY/server/
+mv $WORK_DIR/artifactsdb $WORK_DIR/server/
+mv $WORK_DIR/datacagedb $WORK_DIR/server/
echo "INFO: create tarball"
-tar cvfz $DIRECTORY.tar.gz $DIRECTORY
+mkdir $WORK_DIR/flys-$VERSION
+mv $WORK_DIR/server $WORK_DIR/client $WORK_DIR/flys-$VERSION
+cd $WORK_DIR
+tar cfz flys-$VERSION.tar.gz flys-$VERSION
+echo "INFO: cleanup"
+#rm -r $WORK_DIR/flys-$VERSION
-echo "INFO: remove temporary files and directories"
-rm -rf $ARTIFACTS_HG
-rm -rf $HTTPCLIENT_HG
-rm -rf $FLYS_HG
-rm -rf $DIRECTORY
-rm -rf OpenLayers.2.11.tar.gz
+echo "DONE: $WORK_DIR/flys-$VERSION.tar.gz"
+echo "Changelog: $WORK_DIR/changes_$OLD_REV-$VERSION.txt"
diff -r cfc5540a4eec -r 61bf64b102bc flys-aft/src/main/java/de/intevation/aft/DIPSGauge.java
--- a/flys-aft/src/main/java/de/intevation/aft/DIPSGauge.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-aft/src/main/java/de/intevation/aft/DIPSGauge.java Fri Mar 22 11:25:54 2013 +0100
@@ -113,7 +113,7 @@
String stationString = element.getAttribute("STATIONIERUNG");
if (stationString.length() == 0) {
log.warn("DIPS: Setting station of gauge '" + name + "' to zero.");
- stationString = "0";
+ stationString = "-99999";
}
station = Double.parseDouble(stationString);
if (station == 0d) {
diff -r cfc5540a4eec -r 61bf64b102bc flys-aft/src/main/java/de/intevation/aft/River.java
--- a/flys-aft/src/main/java/de/intevation/aft/River.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-aft/src/main/java/de/intevation/aft/River.java Fri Mar 22 11:25:54 2013 +0100
@@ -65,7 +65,8 @@
public boolean sync(SyncContext context) throws SQLException {
log.info("sync river: " + this);
- Map<Long, DIPSGauge> dipsGauges = context.getDIPSGauges();
+ // Only take relevant gauges into account.
+ Map<Long, DIPSGauge> dipsGauges = context.getDIPSGauges(name, from, to);
ConnectedStatements flysStatements = context.getFlysStatements();
ConnectedStatements aftStatements = context.getAftStatements();
@@ -78,14 +79,18 @@
.getStatement("select.messstelle")
.clearParameters()
.setInt("GEWAESSER_NR", id2)
- .setDouble("START_KM", from)
- .setDouble("END_KM", to)
.executeQuery();
try {
while (messstellenRs.next()) {
String name = messstellenRs.getString("NAME");
String num = messstellenRs.getString("MESSSTELLE_NR");
+ double station = messstellenRs.getDouble("STATIONIERUNG");
+
+ if (!messstellenRs.wasNull() && !inside(station)) {
+ log.warn("Station found in AFT but not in range: " + station);
+ continue;
+ }
Long number = SyncContext.numberToLong(num);
if (number == null) {
@@ -166,12 +171,93 @@
boolean modified = false;
for (DIPSGauge gauge: gauges) {
+ modified |= updateBfGIdOnMasterDischargeTable(context, gauge);
modified |= updateGauge(context, gauge);
}
return modified;
}
+ protected boolean updateBfGIdOnMasterDischargeTable(
+ SyncContext context,
+ DIPSGauge gauge
+ ) throws SQLException {
+ log.info(
+ "FLYS: Updating master discharge table bfg_id for '" +
+ gauge.getAftName() + "'");
+ ConnectedStatements flysStatements = context.getFlysStatements();
+
+ ResultSet rs = flysStatements
+ .getStatement("select.gauge.master.discharge.table")
+ .clearParameters()
+ .setInt("gauge_id", gauge.getFlysId())
+ .executeQuery();
+
+ int flysId;
+
+ try {
+ if (!rs.next()) {
+ log.error(
+ "FLYS: No master discharge table found for gauge '" +
+ gauge.getAftName() + "'");
+ return false;
+ }
+ String bfgId = rs.getString("bfg_id");
+ if (!rs.wasNull()) { // already has BFG_ID
+ return false;
+ }
+ flysId = rs.getInt("id");
+ } finally {
+ rs.close();
+ }
+
+ // We need to find out the BFG_ID of the current discharge table
+ // for this gauge in AFT.
+
+ ConnectedStatements aftStatements = context.getAftStatements();
+
+ rs = aftStatements
+ .getStatement("select.bfg.id.current")
+ .clearParameters()
+ .setString("number", "%" + gauge.getOfficialNumber())
+ .executeQuery();
+
+ String bfgId = null;
+
+ try {
+ if (rs.next()) {
+ bfgId = rs.getString("BFG_ID");
+ }
+ } finally {
+ rs.close();
+ }
+
+ if (bfgId == null) {
+ log.warn(
+ "No BFG_ID found for current discharge table of gauge '" +
+ gauge + "'");
+ return false;
+ }
+
+ // Set the BFG_ID in FLYS.
+ flysStatements.beginTransaction();
+ try {
+ flysStatements
+ .getStatement("update.bfg.id.discharge.table")
+ .clearParameters()
+ .setInt("id", flysId)
+ .setString("bfg_id", bfgId)
+ .executeUpdate();
+ flysStatements.commitTransaction();
+ } catch (SQLException sqle) {
+ flysStatements.rollbackTransaction();
+ log.error(sqle, sqle);
+ return false;
+ }
+
+ return true;
+ }
+
protected boolean updateGauge(
SyncContext context,
DIPSGauge gauge
diff -r cfc5540a4eec -r 61bf64b102bc flys-aft/src/main/java/de/intevation/aft/SyncContext.java
--- a/flys-aft/src/main/java/de/intevation/aft/SyncContext.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-aft/src/main/java/de/intevation/aft/SyncContext.java Fri Mar 22 11:25:54 2013 +0100
@@ -87,6 +87,36 @@
return numberToGauge;
}
+ public Map<Long, DIPSGauge> getDIPSGauges(
+ String riverName,
+ double from,
+ double to
+ ) {
+ if (from > to) {
+ double t = from;
+ from = to;
+ to = t;
+ }
+
+ riverName = riverName.toLowerCase();
+
+ Map<Long, DIPSGauge> result = new HashMap<Long, DIPSGauge>();
+
+ for (Map.Entry<Long, DIPSGauge> entry: numberToGauge.entrySet()) {
+ DIPSGauge gauge = entry.getValue();
+ // XXX: Maybe a bit too sloppy.
+ if (!riverName.contains(gauge.getRiverName().toLowerCase())) {
+ continue;
+ }
+ double station = gauge.getStation();
+ if (station >= from && station <= to) {
+ result.put(entry.getKey(), gauge);
+ }
+ }
+
+ return result;
+ }
+
protected static Map<Long, DIPSGauge> indexByNumber(Document document) {
Map<Long, DIPSGauge> map = new HashMap<Long, DIPSGauge>();
NodeList nodes = document.getElementsByTagName("PEGELSTATION");
diff -r cfc5540a4eec -r 61bf64b102bc flys-aft/src/main/resources/sql/aft-common.properties
--- a/flys-aft/src/main/resources/sql/aft-common.properties Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-aft/src/main/resources/sql/aft-common.properties Fri Mar 22 11:25:54 2013 +0100
@@ -1,8 +1,9 @@
-select.gewaesser = SELECT GEWAESSER_NR, NAME FROM SL_GEWAESSER
+select.gewaesser = \
+ SELECT GEWAESSER_NR, NAME FROM SL_GEWAESSER
select.messstelle = \
- SELECT NAME, MESSSTELLE_NR \
+ SELECT NAME, MESSSTELLE_NR, STATIONIERUNG \
FROM MESSSTELLE \
- WHERE GEWAESSER_NR = :GEWAESSER_NR AND STATIONIERUNG BETWEEN :START_KM AND :END_KM
+ WHERE GEWAESSER_NR = :GEWAESSER_NR
select.abflusstafel = \
SELECT ABFLUSSTAFEL_NR, \
ABFLUSSTAFEL_BEZ, \
@@ -12,6 +13,13 @@
BFG_ID \
FROM ABFLUSSTAFEL \
WHERE MESSSTELLE_NR LIKE :number
-select.tafelwert = SELECT TAFELWERT_NR AS id, WASSERSTAND AS w, ABFLUSS AS q FROM TAFELWERT \
- WHERE ABFLUSSTAFEL_NR = :number
-
+select.tafelwert = \
+ SELECT TAFELWERT_NR AS id, WASSERSTAND AS w, ABFLUSS AS q FROM TAFELWERT \
+ WHERE ABFLUSSTAFEL_NR = :number
+select.bfg.id.current = \
+ SELECT BFG_ID AS BFG_ID FROM ABFLUSSTAFEL \
+ WHERE GUELTIG_VON IN ( \
+ SELECT min(GUELTIG_VON) FROM ABFLUSSTAFEL \
+ WHERE GUELTIG_VON IS NOT NULL AND GUELTIG_BIS IS NULL \
+ AND MESSSTELLE_NR LIKE :number) \
+ AND MESSSTELLE_NR LIKE :number
diff -r cfc5540a4eec -r 61bf64b102bc flys-aft/src/main/resources/sql/flys-common.properties
--- a/flys-aft/src/main/resources/sql/flys-common.properties Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-aft/src/main/resources/sql/flys-common.properties Fri Mar 22 11:25:54 2013 +0100
@@ -6,29 +6,52 @@
JOIN wst_column_values wcv ON wcv.wst_column_id = wc.id \
WHERE w.kind = 0 \
GROUP BY r.id, r.name
-select.gauges = SELECT id, name, official_number FROM gauges WHERE river_id = :river_id
-next.gauge.id = SELECT NEXTVAL('GAUGES_ID_SEQ') AS gauge_id
-insert.gauge = INSERT INTO gauges (id, name, river_id, station, aeo, official_number, datum) \
- VALUES(:id, :name, :river_id, :station, :aeo, :official_number, :datum)
-select.timeintervals = SELECT id, start_time, stop_time FROM time_intervals
-next.timeinterval.id = SELECT NEXTVAL('TIME_INTERVALS_ID_SEQ') AS time_interval_id
-insert.timeinterval = INSERT INTO time_intervals (id, start_time, stop_time) VALUES (:id, :start_time, :stop_time)
-next.discharge.id = SELECT NEXTVAL('DISCHARGE_TABLES_ID_SEQ') AS discharge_table_id
+select.gauges = \
+ SELECT id, name, official_number \
+ FROM gauges \
+ WHERE river_id = :river_id
+next.gauge.id = \
+ SELECT NEXTVAL('GAUGES_ID_SEQ') AS gauge_id
+insert.gauge = \
+ INSERT INTO gauges (id, name, river_id, station, aeo, official_number, datum) \
+ VALUES(:id, :name, :river_id, :station, :aeo, :official_number, :datum)
+select.timeintervals = \
+ SELECT id, start_time, stop_time FROM time_intervals
+next.timeinterval.id = \
+ SELECT NEXTVAL('TIME_INTERVALS_ID_SEQ') AS time_interval_id
+insert.timeinterval = \
+ INSERT INTO time_intervals (id, start_time, stop_time) \
+ VALUES (:id, :start_time, :stop_time)
+next.discharge.id = \
+ SELECT NEXTVAL('DISCHARGE_TABLES_ID_SEQ') AS discharge_table_id
insert.dischargetable = \
INSERT INTO discharge_tables \
(id, gauge_id, description, bfg_id, kind, time_interval_id) \
VALUES (:id, :gauge_id, :description, :bfg_id, 1, :time_interval_id)
-select.discharge.table.values = SELECT id, w, q FROM discharge_table_values WHERE table_id = :table_id
-next.discharge.table.values.id = SELECT NEXTVAL('DISCHARGE_TABLE_VALUES_ID_SEQ') AS discharge_table_values_id
-insert.discharge.table.value = INSERT INTO discharge_table_values (id, table_id, w, q) VALUES (:id, :table_id, :w, :q)
-delete.discharge.table.value = DELETE FROM discharge_table_values WHERE id = :id
+select.discharge.table.values = \
+ SELECT id, w, q FROM discharge_table_values WHERE table_id = :table_id
+next.discharge.table.values.id = \
+ SELECT NEXTVAL('DISCHARGE_TABLE_VALUES_ID_SEQ') AS discharge_table_values_id
+insert.discharge.table.value = \
+ INSERT INTO discharge_table_values (id, table_id, w, q) \
+ VALUES (:id, :table_id, :w, :q)
+delete.discharge.table.value = \
+ DELETE FROM discharge_table_values WHERE id = :id
select.gauge.discharge.tables = \
SELECT \
- dt.id AS id, \
+ dt.id AS id, \
dt.description AS description, \
- ti.start_time AS start_time, \
- ti.stop_time AS stop_time, \
- dt.bfg_id AS bfg_id \
+ ti.start_time AS start_time, \
+ ti.stop_time AS stop_time, \
+ dt.bfg_id AS bfg_id \
FROM discharge_tables dt \
LEFT OUTER JOIN time_intervals ti ON dt.time_interval_id = ti.id \
WHERE gauge_id = :gauge_id
+select.gauge.master.discharge.table = \
+ SELECT \
+ dt.id AS id, \
+ dt.bfg_id AS bfg_id \
+ FROM discharge_tables dt JOIN gauges g ON dt.gauge_id = g.id \
+ WHERE g.id = :gauge_id AND g.kind = 0
+update.bfg.id.discharge.table = \
+ UPDATE discharge_tables SET bfg_id = :bfg_id WHERE id = :id
diff -r cfc5540a4eec -r 61bf64b102bc flys-aft/src/main/resources/sql/flys-oracle-jdbc-oracledriver.properties
--- a/flys-aft/src/main/resources/sql/flys-oracle-jdbc-oracledriver.properties Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-aft/src/main/resources/sql/flys-oracle-jdbc-oracledriver.properties Fri Mar 22 11:25:54 2013 +0100
@@ -1,5 +1,8 @@
-next.gauge.id = SELECT GAUGES_ID_SEQ.NEXTVAL AS gauge_id FROM DUAL
-next.timeinterval.id = SELECT TIME_INTERVALS_ID_SEQ.NEXTVAL AS time_interval_id FROM DUAL
-next.discharge.id = SELECT DISCHARGE_TABLES_ID_SEQ.NEXTVAL AS discharge_table_id FROM DUAL
-next.discharge.table.values.id = SELECT DISCHARGE_TABLE_VALUES_ID_SEQ.NEXTVAL AS discharge_table_values_id FROM DUAL
-
+next.gauge.id = \
+ SELECT GAUGES_ID_SEQ.NEXTVAL AS gauge_id FROM DUAL
+next.timeinterval.id = \
+ SELECT TIME_INTERVALS_ID_SEQ.NEXTVAL AS time_interval_id FROM DUAL
+next.discharge.id = \
+ SELECT DISCHARGE_TABLES_ID_SEQ.NEXTVAL AS discharge_table_id FROM DUAL
+next.discharge.table.values.id = \
+ SELECT DISCHARGE_TABLE_VALUES_ID_SEQ.NEXTVAL AS discharge_table_values_id FROM DUAL
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/conf/artifact-db.xml
--- a/flys-artifacts/doc/conf/artifact-db.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/doc/conf/artifact-db.xml Fri Mar 22 11:25:54 2013 +0100
@@ -6,5 +6,5 @@
<password></password>
<!-- For use with a postgresql database use the appropriate driver-->
<!--driver>org.postgresql.Driver</driver-->
- <url>jdbc:h2:${artifacts.config.dir}/../artifactdb/artifacts.db</url>
+ <url>jdbc:h2:${artifacts.config.dir}/../artifactsdb/artifacts</url>
</database>
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/conf/artifacts/fixanalysis.xml
--- a/flys-artifacts/doc/conf/artifacts/fixanalysis.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/doc/conf/artifacts/fixanalysis.xml Fri Mar 22 11:25:54 2013 +0100
@@ -270,6 +270,7 @@
<facet name="w_differences.manualpoints" description="Manuelle Punkte"/>
<facet name="longitudinal_section.manualpoints" description="Manuelle Punkte"/>
<facet name="longitudinal_section.annotations" description="facet.longitudinal_section.annotations"/>
+ <facet name="longitudinal_section.area" description="facet.longitudinal_section.area"/>
</facets>
</outputmode>
<outputmode name="fix_wq_curve" description="output.fix_wq_curve" mime-type="image/png" type="chart">
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/conf/artifacts/map.xml
--- a/flys-artifacts/doc/conf/artifacts/map.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/doc/conf/artifacts/map.xml Fri Mar 22 11:25:54 2013 +0100
@@ -20,7 +20,8 @@
<facet name="floodmap.wmsbackground"/>
<facet name="floodmap.kms"/>
<facet name="floodmap.qps"/>
- <facet name="floodmap.hws"/>
+ <facet name="floodmap.hws_lines"/>
+ <facet name="floodmap.hws_points"/>
<facet name="floodmap.hydr_boundaries"/>
<facet name="floodmap.hydr_boundaries_poly"/>
<facet name="floodmap.catchment"/>
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/conf/artifacts/winfo.xml
--- a/flys-artifacts/doc/conf/artifacts/winfo.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/doc/conf/artifacts/winfo.xml Fri Mar 22 11:25:54 2013 +0100
@@ -402,6 +402,8 @@
<facets>
<facet name="discharge_longitudinal_section.w"/>
<facet name="discharge_longitudinal_section.q"/>
+ <facet name="discharge_longitudinal_section.q.infolding"/>
+ <facet name="discharge_longitudinal_section.q.cutting"/>
<facet name="discharge_longitudinal_section.c"/>
<facet name="discharge_longitudinal_section.manualpoints" description="Manuelle Punkte"/>
<facet name="other.wqkms.w"/>
@@ -494,12 +496,51 @@
<state id="state.winfo.uesk.scenario" description="state.winfo.uesk.scenario" state="de.intevation.flys.artifacts.states.ScenarioSelect" helpText="help.state.winfo.uesk.scenario">
<data name="scenario" type="String" />
+ </state>
+
+ <transition transition="de.intevation.flys.artifacts.transitions.ValueCompareTransition">
+ <from state="state.winfo.uesk.scenario"/>
+ <to state="state.winfo.uesk.uesk"/>
+ <condition data="scenario" value="scenario.current" operator="equal"/>
+ </transition>
+
+ <transition transition="de.intevation.flys.artifacts.transitions.ValueCompareTransition">
+ <from state="state.winfo.uesk.scenario"/>
+ <to state="state.winfo.uesk.uesk"/>
+ <condition data="scenario" value="scenario.potentiel" operator="equal"/>
+ </transition>
+
+ <transition transition="de.intevation.flys.artifacts.transitions.ValueCompareTransition">
+ <from state="state.winfo.uesk.scenario"/>
+ <to state="state.winfo.uesk.dc-hws"/>
+ <condition data="scenario" value="scenario.scenario" operator="equal"/>
+ </transition>
+
+ <state id="state.winfo.uesk.dc-hws" description="state.winfo.uesk.dc-hws" state="de.intevation.flys.artifacts.states.HWSDatacageState" helpText="help.state.winfo.uesk.dc-hws">
+ <data name="uesk.hws" type="String" />
+ </state>
+
+ <transition transition="de.intevation.flys.artifacts.transitions.DefaultTransition">
+ <from state="state.winfo.uesk.dc-hws" />
+ <to state="state.winfo.uesk.user-rgd" />
+ </transition>
+
+ <state id="state.winfo.uesk.user-rgd" description="state.winfo.uesk.user-rgd" state="de.intevation.flys.artifacts.states.UserRGDState" helpText="help.state.winfo.uesk.user-rgd">
+ <data name="uesk.user-rgd" type="String" />
+ </state>
+
+ <transition transition="de.intevation.flys.artifacts.transitions.DefaultTransition">
+ <from state="state.winfo.uesk.user-rgd" />
+ <to state="state.winfo.uesk.barriers" />
+ </transition>
+
+ <state id="state.winfo.uesk.barriers" description="state.winfo.uesk.barriers" state="de.intevation.flys.artifacts.states.HWSBarriersState" helpText="help.state.winfo.uesk.barriers">
<data name="uesk.barriers" type="String" />
</state>
<transition transition="de.intevation.flys.artifacts.transitions.DefaultTransition">
- <from state="state.winfo.uesk.scenario"/>
- <to state="state.winfo.uesk.uesk"/>
+ <from state="state.winfo.uesk.barriers" />
+ <to state="state.winfo.uesk.uesk" />
</transition>
<state id="state.winfo.uesk.uesk" description="state.winfo.uesk.uesk" state="de.intevation.flys.artifacts.states.FloodMapState" helpText="help.state.winfo.uesk.uesk">
@@ -513,7 +554,8 @@
<facet name="floodmap.wmsbackground"/>
<facet name="floodmap.kms"/>
<facet name="floodmap.qps"/>
- <facet name="floodmap.hws"/>
+ <facet name="floodmap.hws_lines"/>
+ <facet name="floodmap.hws_points"/>
<facet name="floodmap.hydr_boundaries"/>
<facet name="floodmap.hydr_boundaries_poly"/>
<facet name="floodmap.catchment"/>
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/conf/conf.xml
--- a/flys-artifacts/doc/conf/conf.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/doc/conf/conf.xml Fri Mar 22 11:25:54 2013 +0100
@@ -39,24 +39,15 @@
<artifact-factory name="wmsqpsfactory" description="Factory to create an artifact that generates WMS facets for CrossSectionTracks."
ttl="3600000"
artifact="de.intevation.flys.artifacts.WMSQPSArtifact">de.intevation.artifactdatabase.DefaultArtifactFactory</artifact-factory>
- <artifact-factory name="wmshwsfactory" description="Factory to create an artifact that generates WMS facets for CrossSectionTracks."
- ttl="3600000"
- artifact="de.intevation.flys.artifacts.WMSHwsArtifact">de.intevation.artifactdatabase.DefaultArtifactFactory</artifact-factory>
<artifact-factory name="wmshydrboundariesfactory" description="Factory to create an artifact that generates WMS facets for CrossSectionTracks."
ttl="3600000"
artifact="de.intevation.flys.artifacts.WMSHydrBoundaryArtifact">de.intevation.artifactdatabase.DefaultArtifactFactory</artifact-factory>
<artifact-factory name="wmshydrboundariespolyfactory" description="Factory to create an artifact that generates WMS facets for CrossSectionTracks."
ttl="3600000"
artifact="de.intevation.flys.artifacts.WMSHydrBoundaryPolyArtifact">de.intevation.artifactdatabase.DefaultArtifactFactory</artifact-factory>
- <artifact-factory name="wmscatchmentfactory" description="Factory to create an artifact that generates WMS facets for CrossSectionTracks."
- ttl="3600000"
- artifact="de.intevation.flys.artifacts.WMSCatchmentArtifact">de.intevation.artifactdatabase.DefaultArtifactFactory</artifact-factory>
<artifact-factory name="wmsfloodplainfactory" description="Factory to create an artifact that generates WMS facets for CrossSectionTracks."
ttl="3600000"
artifact="de.intevation.flys.artifacts.WMSFloodplainArtifact">de.intevation.artifactdatabase.DefaultArtifactFactory</artifact-factory>
- <artifact-factory name="wmslinefactory" description="Factory to create an artifact to be used in WINFO"
- ttl="3600000"
- artifact="de.intevation.flys.artifacts.WMSLineArtifact">de.intevation.artifactdatabase.DefaultArtifactFactory</artifact-factory>
<artifact-factory name="wmsbuildingsfactory" description="Factory to create an artifact to be used in WINFO"
ttl="3600000"
artifact="de.intevation.flys.artifacts.WMSBuildingsArtifact">de.intevation.artifactdatabase.DefaultArtifactFactory</artifact-factory>
@@ -114,6 +105,12 @@
<artifact-factory name="qsectors" description="Factory to create an artifact to host qsectors."
ttl="3600000"
artifact="de.intevation.flys.artifacts.QSectorArtifact">de.intevation.artifactdatabase.DefaultArtifactFactory</artifact-factory>
+ <artifact-factory name="wmshwslinesfactory" description="Factory to create an artifact that generates WMS facets for HWS Lines"
+ ttl="3600000"
+ artifact="de.intevation.flys.artifacts.WMSHWSLinesArtifact">de.intevation.artifactdatabase.DefaultArtifactFactory</artifact-factory>
+ <artifact-factory name="wmshwspointsfactory" description="Factory to create an artifact that generates WMS facets for HWS Points"
+ ttl="3600000"
+ artifact="de.intevation.flys.artifacts.WMSHWSPointsArtifact">de.intevation.artifactdatabase.DefaultArtifactFactory</artifact-factory>
<!-- MINFO specific Artifacts -->
<artifact-factory name="minfo" description="Factory to create an artifact to be used in module minfo."
@@ -403,9 +400,14 @@
<zoom-scale river="Elbe" range="100" radius="5" />
<zoom-scale river="Elbe" range="500" radius="10" />
</zoom-scales>
+
<minfo-sq>
<!-- valid names: grubbs or std-dev -->
<outlier-method name="grubbs"/>
</minfo-sq>
+
+ <dgm-path>
+ /path/to/rivers/
+ </dgm-path>
</options>
</artifact-database>
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/conf/datacage-db.xml
--- a/flys-artifacts/doc/conf/datacage-db.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/doc/conf/datacage-db.xml Fri Mar 22 11:25:54 2013 +0100
@@ -3,5 +3,5 @@
<user>SA</user>
<password/>
<driver>org.h2.Driver</driver>
- <url>jdbc:h2:${artifacts.config.dir}/../h2/datacage</url>
+ <url>jdbc:h2:${artifacts.config.dir}/../datacagedb/datacage</url>
</datacage>
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/conf/floodmap.xml
--- a/flys-artifacts/doc/conf/floodmap.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/doc/conf/floodmap.xml Fri Mar 22 11:25:54 2013 +0100
@@ -21,13 +21,121 @@
<river name="Mosel">
<srid value="31467"/>
<dgm-srid value="31466"/>
- <river-wms url="http://example.com/cgi-bin/user-wms" layers="Mosel"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Mosel"/>
<background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
</river>
<river name="Elbe">
<srid value="31467"/>
<dgm-srid value="31467"/>
- <river-wms url="http://example.com/cgi-bin/elbe-wms" layers="Elbe"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Elbe"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Donau">
+ <srid value="31467"/>
+ <dgm-srid value="25833"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Donau"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Donaurna">
+ <srid value="31467"/>
+ <dgm-srid value="25833"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Donaurna"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="DonauSK">
+ <srid value="31467"/>
+ <dgm-srid value="25833"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="DonauSK"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Fulda">
+ <srid value="31467"/>
+ <dgm-srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Fulda"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Fulda-Sommer">
+ <srid value="31467"/>
+ <dgm-srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Fulda-Sommer"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Lahn">
+ <srid value="31467"/>
+ <dgm-srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Lahn"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Main">
+ <srid value="31467"/>
+ <dgm-srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Main"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Main-Wehrarm-Limbach">
+ <srid value="31467"/>
+ <dgm-srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Main-Wehrarm-Limbach"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Main-Wehrarm-Volkach">
+ <srid value="31467"/>
+ <dgm-srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Main-Wehrarm-Volkach"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Neckar">
+ <srid value="31467"/>
+ <dgm-srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Neckar"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Neckar-über-Wehrarme">
+ <srid value="31467"/>
+ <dgm-srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Neckar-über-Wehrarme"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Rhein">
+ <srid value="31467"/>
+ <dgm-srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Rhein"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Saale">
+ <srid value="31467"/>
+ <dgm-srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Saale"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Saale-Thüringen">
+ <srid value="31467"/>
+ <dgm-srid value="31468"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Saale-Thüringen"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Saar-Wilt-Bogen">
+ <srid value="31467"/>
+ <dgm-srid value="31466"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Saar-Wilt-Bogen"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Werra">
+ <srid value="31467"/>
+ <dgm-srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Werra"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Werra-Sommer">
+ <srid value="31467"/>
+ <dgm-srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Werra-Sommer"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Weser">
+ <srid value="31467"/>
+ <dgm-srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Weser"/>
<background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
</river>
</floodmap>
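Each new `<river>` entry above carries a map `srid` and a separate `dgm-srid`; where the two differ (e.g. Mosel: 31467 vs. 31466), the DEM must be reprojected before it can be overlaid on the map. A small sketch (not part of the patch) that parses such a snippet and counts rivers needing reprojection:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class FloodmapCheck {
    // Count rivers whose dgm-srid differs from the map srid.
    static int countReprojected(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
            .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        NodeList rivers = doc.getElementsByTagName("river");
        int n = 0;
        for (int i = 0; i < rivers.getLength(); i++) {
            Element r = (Element) rivers.item(i);
            String srid = ((Element) r.getElementsByTagName("srid").item(0)).getAttribute("value");
            String dgm  = ((Element) r.getElementsByTagName("dgm-srid").item(0)).getAttribute("value");
            if (!srid.equals(dgm)) n++;
        }
        return n;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<floodmap>"
            + "<river name='Mosel'><srid value='31467'/><dgm-srid value='31466'/></river>"
            + "<river name='Elbe'><srid value='31467'/><dgm-srid value='31467'/></river>"
            + "</floodmap>";
        System.out.println(countReprojected(xml)); // prints 1
    }
}
```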
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/conf/mapserver/river-mapfile.vm
--- a/flys-artifacts/doc/conf/mapserver/river-mapfile.vm Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/doc/conf/mapserver/river-mapfile.vm Fri Mar 22 11:25:54 2013 +0100
@@ -14,7 +14,7 @@
END
DEBUG 3
- CONFIG "MS_ERRORFILE" "log/rivers.log"
+ CONFIG "MS_ERRORFILE" "/tmp/flys-rivers-wms.log"
WEB
METADATA
@@ -57,4 +57,4 @@
#foreach ($LAYER in $LAYERS)
include "$LAYER"
#end
-END
\ No newline at end of file
+END
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/conf/meta-data.xml
--- a/flys-artifacts/doc/conf/meta-data.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/doc/conf/meta-data.xml Fri Mar 22 11:25:54 2013 +0100
@@ -1,7 +1,10 @@
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<dc:template xmlns:dc="http://www.intevation.org/2011/Datacage">
<datacage>
-<dc:macro name="SQL-wst_columns_statement">
+ <dc:comment>
+ Statement to load data from wsts.
+ </dc:comment>
+ <dc:macro name="SQL-wst_columns_statement">
<dc:statement>
SELECT wst_columns.id AS prot_column_id,
wst_columns.name AS prot_column_name,
@@ -10,10 +13,49 @@
wst_ranges.a AS deffrom,
wst_ranges.b AS defto
FROM wst_columns, wst_ranges
- WHERE wst_columns.wst_id = ${prot_id} AND wst_ranges.wst_column_id = wst_columns.id
+ WHERE wst_columns.wst_id = ${prot_id}
+ AND wst_ranges.wst_column_id = wst_columns.id
+ AND (${fromkm} BETWEEN wst_ranges.a AND wst_ranges.b
+ OR ${tokm} BETWEEN wst_ranges.a AND wst_ranges.b
+ OR wst_ranges.a BETWEEN ${fromkm} AND ${tokm}
+ OR wst_ranges.b BETWEEN ${fromkm} AND ${tokm})
ORDER by wst_columns.position
</dc:statement>
-</dc:macro>
+ </dc:macro>
+
+ <dc:comment>
+ Load user specific distance information from artifact.
+ </dc:comment>
+ <dc:macro name="user-range">
+ <dc:choose>
+ <dc:when test="dc:contains($parameters, 'user-id')">
+ <dc:context connection="user">
+ <dc:statement>
+ SELECT COALESCE(ld_mode, '') AS ldm,
+ COALESCE(ld_locations, '') AS ldl,
+ COALESCE(ld_from, '') AS ldf,
+ COALESCE(ld_to, '') AS ldt
+ FROM master_artifacts_range
+ WHERE gid = CAST(${artifact-id} as uuid)
+ </dc:statement>
+ <dc:elements>
+ <dc:variable name="fromkm" type="number" expr="dc:fromValue($ldm, $ldl, $ldf)"/>
+ <dc:variable name="tokm" type="number" expr="dc:toValue($ldm, $ldl, $ldt)"/>
+ <dc:macro-body/>
+ </dc:elements>
+ </dc:context>
+ </dc:when>
+ <dc:otherwise>
+ <dc:variable name="fromkm" type="number" expr="dc:fromValue('', '', '')"/>
+ <dc:variable name="tokm" type="number" expr="dc:toValue('', '', '')"/>
+ <dc:macro-body/>
+ </dc:otherwise>
+ </dc:choose>
+ </dc:macro>
+
+ <dc:comment>
+ System part. Load data for the given river.
+ </dc:comment>
<dc:macro name="load-system">
<dc:context connection="system">
<dc:statement>
@@ -21,13 +63,15 @@
WHERE lower(name) LIKE lower(${river})
</dc:statement>
<dc:elements>
+
<dc:comment>
- Base-data macros (mostly data imported from wst-files)
+ Base-data macros (mostly data imported from wst-files).
</dc:comment>
<dc:macro name="basedata_0">
+ <dc:call-macro name="user-range">
<dc:comment comment=" BASEDATA ---------------------------"/>
<basedata>
- <dc:context>
+ <dc:context connection="system">
<dc:statement>
SELECT id AS prot_id,
description AS prot_description
@@ -51,11 +95,14 @@
</dc:elements>
</dc:context>
</basedata>
+ </dc:call-macro>
</dc:macro>
+
<dc:macro name="basedata_0_wq">
+ <dc:call-macro name="user-range">
<dc:comment comment=" BASEDATA ---------------------------"/>
<basedata>
- <dc:context>
+ <dc:context connection="system">
<dc:statement>
SELECT id AS prot_id,
description AS prot_description
@@ -79,12 +126,14 @@
</dc:elements>
</dc:context>
</basedata>
+ </dc:call-macro>
</dc:macro>
<dc:macro name="basedata_1_additionals_marks">
+ <dc:call-macro name="user-range">
<dc:comment comment=".ZUS -------------------------------"/>
<additionals>
- <dc:context>
+ <dc:context connection="system">
<dc:statement>
SELECT id AS prot_id,
description AS prot_description
@@ -108,12 +157,14 @@
</dc:elements>
</dc:context>
</additionals>
+ </dc:call-macro>
</dc:macro>
<dc:macro name="basedata_1_additionals">
+ <dc:call-macro name="user-range">
<dc:comment comment=".ZUS -------------------------------"/>
<additionals>
- <dc:context>
+ <dc:context connection="system">
<dc:statement>
SELECT id AS prot_id,
description AS prot_description
@@ -137,12 +188,14 @@
</dc:elements>
</dc:context>
</additionals>
+ </dc:call-macro>
</dc:macro>
<dc:macro name="basedata_1_additionals-relative_point">
+ <dc:call-macro name="user-range">
<dc:comment comment=".ZUS -------------------------------"/>
<additionals>
- <dc:context>
+ <dc:context connection="system">
<dc:statement>
SELECT id AS prot_id,
description AS prot_description
@@ -166,125 +219,135 @@
</dc:elements>
</dc:context>
</additionals>
+ </dc:call-macro>
</dc:macro>
<dc:macro name="basedata_2_fixations_wst">
- <fixations>
- <dc:context>
- <dc:statement>
- SELECT id AS prot_id,
- description AS prot_description
- FROM wsts WHERE kind = 2 AND river_id = ${river_id}
- </dc:statement>
- <dc:elements>
- <fixation>
- <dc:attribute name="name" value="${prot_description}"/>
- <!--dc:attribute name="ids" value="fixations-wstv-A-${prot_id}"/-->
- <dc:context>
- <dc:call-macro name="SQL-wst_columns_statement"/>
- <dc:elements>
- <column>
- <dc:attribute name="name" value="${prot_column_name}"/>
- <dc:attribute name="ids" value="fixations-wstv-${prot_rel_pos}-${prot_id}"/>
- <dc:attribute name="factory" value="wqinterpol"/>
- <dc:attribute name="info" value="${info} [km ${deffrom} - ${defto}]"/>
- </column>
- </dc:elements>
- </dc:context>
- </fixation>
- </dc:elements>
- </dc:context>
- </fixations>
+ <dc:call-macro name="user-range">
+ <fixations>
+ <dc:context connection="system">
+ <dc:statement>
+ SELECT id AS prot_id,
+ description AS prot_description
+ FROM wsts WHERE kind = 2 AND river_id = ${river_id}
+ </dc:statement>
+ <dc:elements>
+ <fixation>
+ <dc:attribute name="name" value="${prot_description}"/>
+ <!--dc:attribute name="ids" value="fixations-wstv-A-${prot_id}"/-->
+ <dc:context>
+ <dc:call-macro name="SQL-wst_columns_statement"/>
+ <dc:elements>
+ <column>
+ <dc:attribute name="name" value="${prot_column_name}"/>
+ <dc:attribute name="ids" value="fixations-wstv-${prot_rel_pos}-${prot_id}"/>
+ <dc:attribute name="factory" value="wqinterpol"/>
+ <dc:attribute name="info" value="${info} [km ${deffrom} - ${defto}]"/>
+ </column>
+ </dc:elements>
+ </dc:context>
+ </fixation>
+ </dc:elements>
+ </dc:context>
+ </fixations>
+ </dc:call-macro>
</dc:macro>
<dc:macro name="basedata_2_fixations_wqkms">
- <fixations>
- <dc:context>
- <dc:statement>
- SELECT id AS prot_id,
- description AS prot_description
- FROM wsts WHERE kind = 2 AND river_id = ${river_id}
- </dc:statement>
- <dc:elements>
- <fixation>
- <dc:attribute name="name" value="${prot_description}"/>
- <dc:context>
- <dc:call-macro name="SQL-wst_columns_statement"/>
- <dc:elements>
- <column>
- <dc:attribute name="name" value="${prot_column_name}"/>
- <dc:attribute name="ids" value="fixations-wstv-${prot_rel_pos}-${prot_id}"/>
- <dc:attribute name="factory" value="wqinterpol"/>
- <dc:attribute name="info" value="${info} [km ${deffrom} - ${defto}]"/>
- </column>
- </dc:elements>
- </dc:context>
- </fixation>
- </dc:elements>
- </dc:context>
- </fixations>
+ <dc:call-macro name="user-range">
+ <fixations>
+ <dc:context connection="system">
+ <dc:statement>
+ SELECT id AS prot_id,
+ description AS prot_description
+ FROM wsts WHERE kind = 2 AND river_id = ${river_id}
+ </dc:statement>
+ <dc:elements>
+ <fixation>
+ <dc:attribute name="name" value="${prot_description}"/>
+ <dc:context>
+ <dc:call-macro name="SQL-wst_columns_statement"/>
+ <dc:elements>
+ <column>
+ <dc:attribute name="name" value="${prot_column_name}"/>
+ <dc:attribute name="ids" value="fixations-wstv-${prot_rel_pos}-${prot_id}"/>
+ <dc:attribute name="factory" value="wqinterpol"/>
+ <dc:attribute name="info" value="${info} [km ${deffrom} - ${defto}]"/>
+ </column>
+ </dc:elements>
+ </dc:context>
+ </fixation>
+ </dc:elements>
+ </dc:context>
+ </fixations>
+ </dc:call-macro>
</dc:macro>
<dc:macro name="basedata_2_fixations">
- <fixations>
- <dc:context>
- <dc:statement>
- SELECT id AS prot_id,
- description AS prot_description
- FROM wsts WHERE kind = 2 AND river_id = ${river_id}
- </dc:statement>
- <dc:elements>
- <fixation>
- <dc:attribute name="name" value="${prot_description}"/>
- <dc:context>
- <dc:call-macro name="SQL-wst_columns_statement"/>
- <dc:elements>
- <column>
- <dc:attribute name="name" value="${prot_column_name}"/>
- <dc:attribute name="ids" value="fixations-wstv-${prot_rel_pos}-${prot_id}"/>
- <dc:attribute name="factory" value="staticwkms"/>
- <dc:attribute name="info" value="${info} [km ${deffrom} - ${defto}]"/>
- </column>
- </dc:elements>
- </dc:context>
- </fixation>
- </dc:elements>
- </dc:context>
- </fixations>
+ <dc:call-macro name="user-range">
+ <fixations>
+ <dc:context connection="system">
+ <dc:statement>
+ SELECT id AS prot_id,
+ description AS prot_description
+ FROM wsts WHERE kind = 2 AND river_id = ${river_id}
+ </dc:statement>
+ <dc:elements>
+ <fixation>
+ <dc:attribute name="name" value="${prot_description}"/>
+ <dc:context>
+ <dc:call-macro name="SQL-wst_columns_statement"/>
+ <dc:elements>
+ <column>
+ <dc:attribute name="name" value="${prot_column_name}"/>
+ <dc:attribute name="ids" value="fixations-wstv-${prot_rel_pos}-${prot_id}"/>
+ <dc:attribute name="factory" value="staticwkms"/>
+ <dc:attribute name="info" value="${info} [km ${deffrom} - ${defto}]"/>
+ </column>
+ </dc:elements>
+ </dc:context>
+ </fixation>
+ </dc:elements>
+ </dc:context>
+ </fixations>
+ </dc:call-macro>
</dc:macro>
<dc:macro name="basedata_2_fixations_relative_point">
- <fixations>
- <dc:context>
- <dc:statement>
- SELECT id AS prot_id,
- description AS prot_description
- FROM wsts WHERE kind = 2 AND river_id = ${river_id}
- </dc:statement>
- <dc:elements>
- <relativepoint>
- <dc:attribute name="name" value="${prot_description}"/>
- <dc:context>
- <dc:call-macro name="SQL-wst_columns_statement"/>
- <dc:elements>
- <column>
- <dc:attribute name="name" value="${prot_column_name}"/>
- <dc:attribute name="ids" value="fixations-wstv-${prot_rel_pos}-${prot_id}"/>
- <dc:attribute name="factory" value="staticwkms"/>
- <dc:attribute name="info" value="${info} [km ${deffrom} - ${defto}]"/>
- </column>
- </dc:elements>
- </dc:context>
- </relativepoint>
- </dc:elements>
- </dc:context>
- </fixations>
+ <dc:call-macro name="user-range">
+ <fixations>
+ <dc:context connection="system">
+ <dc:statement>
+ SELECT id AS prot_id,
+ description AS prot_description
+ FROM wsts WHERE kind = 2 AND river_id = ${river_id}
+ </dc:statement>
+ <dc:elements>
+ <relativepoint>
+ <dc:attribute name="name" value="${prot_description}"/>
+ <dc:context>
+ <dc:call-macro name="SQL-wst_columns_statement"/>
+ <dc:elements>
+ <column>
+ <dc:attribute name="name" value="${prot_column_name}"/>
+ <dc:attribute name="ids" value="fixations-wstv-${prot_rel_pos}-${prot_id}"/>
+ <dc:attribute name="factory" value="staticwkms"/>
+ <dc:attribute name="info" value="${info} [km ${deffrom} - ${defto}]"/>
+ </column>
+ </dc:elements>
+ </dc:context>
+ </relativepoint>
+ </dc:elements>
+ </dc:context>
+ </fixations>
+ </dc:call-macro>
</dc:macro>
<dc:macro name="basedata_3_officials">
+ <dc:call-macro name="user-range">
<dc:comment comment=".wst -------------------------------"/>
<officiallines>
- <dc:context>
+ <dc:context connection="system">
<dc:statement>
SELECT id AS prot_id,
description AS prot_description
@@ -308,107 +371,149 @@
</dc:elements>
</dc:context>
</officiallines>
+ </dc:call-macro>
</dc:macro>
<dc:macro name="basedata_4_heightmarks-points-relative_points">
- <heightmarks>
- <dc:context>
- <dc:statement>
- SELECT id AS prot_id,
- description AS prot_description
- FROM wsts WHERE kind = 4 AND river_id = ${river_id}
- </dc:statement>
- <dc:elements>
- <relativepoint>
- <dc:attribute name="name" value="${prot_description}"/>
- <dc:context>
- <dc:call-macro name="SQL-wst_columns_statement"/>
- <dc:elements>
- <column>
- <dc:attribute name="name" value="${prot_column_name}"/>
- <dc:attribute name="ids" value="heightmarks_points-wstv-${prot_rel_pos}-${prot_id}"/>
- <dc:attribute name="factory" value="staticwkms"/>
- <dc:attribute name="info" value="${info} [km ${deffrom} - ${defto}]"/>
- </column>
- </dc:elements>
- </dc:context>
- </relativepoint>
- </dc:elements>
- </dc:context>
- </heightmarks>
+ <dc:call-macro name="user-range">
+ <heightmarks>
+ <dc:context connection="system">
+ <dc:statement>
+ SELECT id AS prot_id,
+ description AS prot_description
+ FROM wsts WHERE kind = 4 AND river_id = ${river_id}
+ </dc:statement>
+ <dc:elements>
+ <relativepoint>
+ <dc:attribute name="name" value="${prot_description}"/>
+ <dc:context>
+ <dc:call-macro name="SQL-wst_columns_statement"/>
+ <dc:elements>
+ <column>
+ <dc:attribute name="name" value="${prot_column_name}"/>
+ <dc:attribute name="ids" value="heightmarks_points-wstv-${prot_rel_pos}-${prot_id}"/>
+ <dc:attribute name="factory" value="staticwkms"/>
+ <dc:attribute name="info" value="${info} [km ${deffrom} - ${defto}]"/>
+ </column>
+ </dc:elements>
+ </dc:context>
+ </relativepoint>
+ </dc:elements>
+ </dc:context>
+ </heightmarks>
+ </dc:call-macro>
</dc:macro>
<dc:macro name="basedata_4_heightmarks-points">
- <heightmarks>
- <dc:context>
- <dc:statement>
- SELECT id AS prot_id,
- description AS prot_description
- FROM wsts WHERE kind = 4 AND river_id = ${river_id}
- </dc:statement>
- <dc:elements>
- <heightmark>
- <dc:attribute name="name" value="${prot_description}"/>
- <dc:context>
- <dc:call-macro name="SQL-wst_columns_statement"/>
- <dc:elements>
- <column>
- <dc:attribute name="name" value="${prot_column_name}"/>
- <dc:attribute name="ids" value="heightmarks_points-wstv-${prot_rel_pos}-${prot_id}"/>
- <dc:attribute name="factory" value="staticwkms"/>
- <dc:attribute name="info" value="${info} [km ${deffrom} - ${defto}]"/>
- </column>
- </dc:elements>
- </dc:context>
- </heightmark>
- </dc:elements>
- </dc:context>
- </heightmarks>
+ <dc:call-macro name="user-range">
+ <heightmarks>
+ <dc:context connection="system">
+ <dc:statement>
+ SELECT id AS prot_id,
+ description AS prot_description
+ FROM wsts WHERE kind = 4 AND river_id = ${river_id}
+ </dc:statement>
+ <dc:elements>
+ <heightmark>
+ <dc:attribute name="name" value="${prot_description}"/>
+ <dc:context>
+ <dc:call-macro name="SQL-wst_columns_statement"/>
+ <dc:elements>
+ <column>
+ <dc:attribute name="name" value="${prot_column_name}"/>
+ <dc:attribute name="ids" value="heightmarks_points-wstv-${prot_rel_pos}-${prot_id}"/>
+ <dc:attribute name="factory" value="staticwkms"/>
+ <dc:attribute name="info" value="${info} [km ${deffrom} - ${defto}]"/>
+ </column>
+ </dc:elements>
+ </dc:context>
+ </heightmark>
+ </dc:elements>
+ </dc:context>
+ </heightmarks>
+ </dc:call-macro>
</dc:macro>
<dc:macro name="basedata_4_heightmarks-wq">
- <heightmarks>
- <dc:context>
- <dc:statement>
- SELECT id AS prot_id,
- description AS prot_description
- FROM wsts WHERE kind = 4 AND river_id = ${river_id}
- </dc:statement>
- <dc:elements>
- <heightmark>
- <dc:attribute name="name" value="${prot_description}"/>
- <dc:context>
- <dc:call-macro name="SQL-wst_columns_statement"/>
- <dc:elements>
- <column>
- <dc:attribute name="name" value="${prot_column_name}"/>
- <dc:attribute name="ids" value="heightmarks_annotations-wstv-${prot_rel_pos}-${prot_id}"/>
- <dc:attribute name="factory" value="wqinterpol"/>
- <dc:attribute name="info" value="${info} [km ${deffrom} - ${defto}]"/>
- </column>
- </dc:elements>
- </dc:context>
- </heightmark>
- </dc:elements>
- </dc:context>
- </heightmarks>
+ <dc:call-macro name="user-range">
+ <heightmarks>
+ <dc:context connection="system">
+ <dc:statement>
+ SELECT id AS prot_id,
+ description AS prot_description
+ FROM wsts WHERE kind = 4 AND river_id = ${river_id}
+ </dc:statement>
+ <dc:elements>
+ <heightmark>
+ <dc:attribute name="name" value="${prot_description}"/>
+ <dc:context>
+ <dc:call-macro name="SQL-wst_columns_statement"/>
+ <dc:elements>
+ <column>
+ <dc:attribute name="name" value="${prot_column_name}"/>
+ <dc:attribute name="ids" value="heightmarks_annotations-wstv-${prot_rel_pos}-${prot_id}"/>
+ <dc:attribute name="factory" value="wqinterpol"/>
+ <dc:attribute name="info" value="${info} [km ${deffrom} - ${defto}]"/>
+ </column>
+ </dc:elements>
+ </dc:context>
+ </heightmark>
+ </dc:elements>
+ </dc:context>
+ </heightmarks>
+ </dc:call-macro>
</dc:macro>
<dc:macro name="basedata_5_flood-protections_relative_points">
- <flood_protections>
- <dc:attribute name="id" value="flood-protections-${river_id}"/>
- <dc:context>
- <dc:statement>
- SELECT id AS prot_id,
- description AS prot_description
- FROM wsts WHERE kind = 5 AND river_id = ${river_id}
- </dc:statement>
- <dc:elements>
- <relativepoint>
- <dc:attribute name="name" value="${prot_description}"/>
- <dc:attribute name="db-id" value="${prot_id}"/>
- <dc:attribute name="factory" value="staticwkms"/>
- <columns>
+ <dc:call-macro name="user-range">
+ <flood_protections>
+ <dc:attribute name="id" value="flood-protections-${river_id}"/>
+ <dc:context connection="system">
+ <dc:statement>
+ SELECT id AS prot_id,
+ description AS prot_description
+ FROM wsts WHERE kind = 5 AND river_id = ${river_id}
+ </dc:statement>
+ <dc:elements>
+ <relativepoint>
+ <dc:attribute name="name" value="${prot_description}"/>
+ <dc:attribute name="db-id" value="${prot_id}"/>
+ <dc:attribute name="factory" value="staticwkms"/>
+ <columns>
+ <dc:context>
+ <dc:call-macro name="SQL-wst_columns_statement"/>
+ <dc:elements>
+ <column>
+ <dc:attribute name="name" value="${prot_column_name}"/>
+ <dc:attribute name="ids" value="flood_protection-wstv-${prot_rel_pos}-${prot_id}"/>
+ <dc:attribute name="factory" value="staticwkms"/>
+ <dc:attribute name="info" value="${info} [km ${deffrom} - ${defto}]"/>
+ </column>
+ </dc:elements>
+ </dc:context>
+ </columns>
+ </relativepoint>
+ </dc:elements>
+ </dc:context>
+ </flood_protections>
+ </dc:call-macro>
+ </dc:macro>
+
+ <dc:macro name="basedata_5_flood-protections">
+ <dc:call-macro name="user-range">
+ <flood_protections>
+ <dc:attribute name="id" value="flood-protections-${river_id}"/>
+ <dc:context connection="system">
+ <dc:statement>
+ SELECT id AS prot_id,
+ description AS prot_description
+ FROM wsts WHERE kind = 5 AND river_id = ${river_id}
+ </dc:statement>
+ <dc:elements>
+ <flood_protection>
+ <dc:attribute name="name" value="${prot_description}"/>
+ <dc:attribute name="db-id" value="${prot_id}"/>
+ <dc:attribute name="factory" value="staticwkms"/>
<dc:context>
<dc:call-macro name="SQL-wst_columns_statement"/>
<dc:elements>
@@ -420,42 +525,11 @@
</column>
</dc:elements>
</dc:context>
- </columns>
- </relativepoint>
- </dc:elements>
- </dc:context>
- </flood_protections>
- </dc:macro>
-
- <dc:macro name="basedata_5_flood-protections">
- <flood_protections>
- <dc:attribute name="id" value="flood-protections-${river_id}"/>
- <dc:context>
- <dc:statement>
- SELECT id AS prot_id,
- description AS prot_description
- FROM wsts WHERE kind = 5 AND river_id = ${river_id}
- </dc:statement>
- <dc:elements>
- <flood_protection>
- <dc:attribute name="name" value="${prot_description}"/>
- <dc:attribute name="db-id" value="${prot_id}"/>
- <dc:attribute name="factory" value="staticwkms"/>
- <dc:context>
- <dc:call-macro name="SQL-wst_columns_statement"/>
- <dc:elements>
- <column>
- <dc:attribute name="name" value="${prot_column_name}"/>
- <dc:attribute name="ids" value="flood_protection-wstv-${prot_rel_pos}-${prot_id}"/>
- <dc:attribute name="factory" value="staticwkms"/>
- <dc:attribute name="info" value="${info} [km ${deffrom} - ${defto}]"/>
- </column>
- </dc:elements>
- </dc:context>
- </flood_protection>
- </dc:elements>
- </dc:context>
- </flood_protections>
+ </flood_protection>
+ </dc:elements>
+ </dc:context>
+ </flood_protections>
+ </dc:call-macro>
</dc:macro>
<dc:macro name="mainvalues">
@@ -512,97 +586,129 @@
</dc:macro>
<dc:macro name="cross_sections">
- <cross_sections>
- <dc:attribute name="id" value="flood-protections-${river_id}"/>
- <dc:context>
- <dc:statement>
- SELECT id AS prot_id,
- description AS prot_description
- FROM cross_sections WHERE river_id = ${river_id}
- </dc:statement>
- <dc:elements>
- <cross_section>
- <dc:attribute name="name" value="${prot_description}"/>
- <dc:attribute name="ids" value="${prot_id}"/>
- <dc:attribute name="factory" value="crosssections"/>
- </cross_section>
- </dc:elements>
- </dc:context>
- </cross_sections>
+ <dc:call-macro name="user-range">
+ <cross_sections>
+          <dc:attribute name="id" value="cross-sections-${river_id}"/>
+ <dc:context connection="system">
+ <dc:statement>
+ SELECT DISTINCT
+ cs.id AS prot_id,
+ cs.description AS prot_description
+ FROM cross_sections cs
+ JOIN cross_section_lines csl ON csl.cross_section_id = cs.id
+ WHERE cs.river_id = ${river_id}
+ AND csl.km BETWEEN ${fromkm} AND ${tokm}
+ </dc:statement>
+ <dc:elements>
+ <cross_section>
+ <dc:attribute name="name" value="${prot_description}"/>
+ <dc:attribute name="ids" value="${prot_id}"/>
+ <dc:attribute name="factory" value="crosssections"/>
+ </cross_section>
+ </dc:elements>
+ </dc:context>
+ </cross_sections>
+ </dc:call-macro>
</dc:macro>
<dc:macro name="hyks">
- <hyks>
- <dc:attribute name="id" value="hyk-${river_id}"/>
- <dc:context>
- <dc:statement>
- SELECT id AS hyk_id,
- description AS hyk_description
- FROM hyks WHERE river_id = ${river_id}
- </dc:statement>
- <dc:elements>
- <hyk>
- <dc:attribute name="name" value="${hyk_description}"/>
- <dc:attribute name="ids" value="${hyk_id}"/>
- <dc:attribute name="factory" value="hyk"/>
- </hyk>
- </dc:elements>
- </dc:context>
- </hyks>
+ <dc:call-macro name="user-range">
+ <hyks>
+ <dc:attribute name="id" value="hyk-${river_id}"/>
+ <dc:context connection="system">
+ <dc:statement>
+ SELECT DISTINCT
+ h.id AS hyk_id,
+ h.description AS hyk_description
+ FROM hyks h
+ JOIN hyk_entries he ON he.hyk_id = h.id
+            WHERE h.river_id = ${river_id}
+ AND he.km BETWEEN ${fromkm} AND ${tokm}
+ </dc:statement>
+ <dc:elements>
+ <hyk>
+ <dc:attribute name="name" value="${hyk_description}"/>
+ <dc:attribute name="ids" value="${hyk_id}"/>
+ <dc:attribute name="factory" value="hyk"/>
+ </hyk>
+ </dc:elements>
+ </dc:context>
+ </hyks>
+ </dc:call-macro>
</dc:macro>
<dc:macro name="flow_velocity_measurements">
- <flowvelocitymeasurement>
- <dc:context>
- <dc:statement>
- SELECT id AS fvmid,
- description AS fvmd
- FROM flow_velocity_measurements WHERE river_id = ${river_id}
- </dc:statement>
- <dc:elements>
- <flow_velocity_measurement>
- <dc:attribute name="name" value="${fvmd}"/>
- <dc:attribute name="ids" value="${fvmid}"/>
- <dc:attribute name="factory" value="flowvelocity"/>
- <dc:context>
- <dc:statement>
- SELECT id, description, station, datetime, v, w, q
- FROM flow_velocity_measure_values
- WHERE measurements_id = ${fvmid}
- </dc:statement>
- <dc:elements>
- <measurement_value>
- <dc:attribute name="name" value="${id}-${description}-${station}-${datetime}"/>
- <dc:attribute name="ids" value="${id}"/>
- <dc:attribute name="factory" value="flowvelocity"/>
- </measurement_value>
- </dc:elements>
- </dc:context>
- </flow_velocity_measurement>
+ <dc:call-macro name="user-range">
+ <flowvelocitymeasurement>
+ <dc:context connection="system">
+ <dc:statement>
+ SELECT id AS fvmid,
+ description AS fvmd
+ FROM flow_velocity_measurements WHERE river_id = ${river_id}
+ </dc:statement>
+ <dc:elements>
+ <flow_velocity_measurement>
+ <dc:attribute name="name" value="${fvmd}"/>
+ <dc:attribute name="ids" value="${fvmid}"/>
+ <dc:attribute name="factory" value="flowvelocity"/>
+ <dc:context>
+ <dc:statement>
+ SELECT id, description, station, datetime, v, w, q
+ FROM flow_velocity_measure_values
+ WHERE measurements_id = ${fvmid}
+ AND station BETWEEN ${fromkm} AND ${tokm}
+ </dc:statement>
+ <dc:elements>
+ <measurement_value>
+ <dc:attribute name="name" value="${id}-${description}-${station}-${datetime}"/>
+ <dc:attribute name="ids" value="${id}"/>
+ <dc:attribute name="factory" value="flowvelocity"/>
+ </measurement_value>
+ </dc:elements>
+ </dc:context>
+ </flow_velocity_measurement>
</dc:elements>
</dc:context>
</flowvelocitymeasurement>
+ </dc:call-macro>
+ </dc:macro>
+
+ <dc:macro name="sounding-width">
+ <soundings_width>
+ <dc:context>
+ <dc:statement>
+ SELECT id AS bedh_id,
+ year AS bedh_year,
+ description AS bedh_descr
+ FROM bed_height_single WHERE river_id = ${river_id}
+ </dc:statement>
+ <dc:elements>
+ <height>
+ <dc:attribute name="factory" value="bedheight"/>
+ <dc:attribute name="ids" value="bedheight-singlevalues-${bedh_id}-${bedh_year}"/>
+ <dc:attribute name="description" value="${bedh_descr}"/>
+ </height>
+ </dc:elements>
+ </dc:context>
+ </soundings_width>
</dc:macro>
<dc:macro name="longitudinal-section-prototype">
- <dc:call-macro name="basedata_0"/>
- <dc:call-macro name="basedata_1_additionals"/>
- <dc:comment comment=" FIXATIONS ---------------------------"/>
- <dc:call-macro name="basedata_2_fixations"/>
- <dc:comment comment=" HOEHENMARKEN ---------------------------"/>
- <dc:call-macro name="basedata_4_heightmarks-points"/>
- <dc:comment comment=" AMTL LINIEN ---------------------------"/>
- <dc:call-macro name="basedata_3_officials"/>
- <dc:call-macro name="basedata_5_flood-protections"/>
- <dc:call-macro name="annotations_per_type"/>
+ <dc:call-macro name="basedata_0"/>
+ <dc:call-macro name="basedata_1_additionals"/>
+ <dc:comment comment=" FIXATIONS ---------------------------"/>
+ <dc:call-macro name="basedata_2_fixations"/>
+ <dc:comment comment=" HOEHENMARKEN ---------------------------"/>
+ <dc:call-macro name="basedata_4_heightmarks-points"/>
+ <dc:comment comment=" AMTL LINIEN ---------------------------"/>
+ <dc:call-macro name="basedata_3_officials"/>
+ <dc:call-macro name="basedata_5_flood-protections"/>
+ <dc:call-macro name="annotations_per_type"/>
</dc:macro>
<dc:comment>
-
+ River-Node
-
</dc:comment>
-
<river>
<dc:attribute name="name" value="${river_name}"/>
@@ -631,6 +737,9 @@
<dc:if test="dc:contains($artifact-outs, 'fix_wq_curve')">
<dc:call-macro name="qsectors"/>
</dc:if>
+ <dc:if test="dc:contains($artifact-outs, 'longitudinal_section')">
+ <dc:call-macro name="annotations"/>
+ </dc:if>
<dc:if test="dc:contains($artifact-outs, 'fix_longitudinal_section_curve')">
<dc:call-macro name="annotations"/>
</dc:if>
@@ -709,23 +818,7 @@
MINFO bedheight middle
</dc:comment>
<dc:if test="dc:contains($artifact-outs, 'bedheight_middle')">
- <soundings_width>
- <dc:context>
- <dc:statement>
- SELECT id AS bedh_id,
- year AS bedh_year,
- description AS bedh_descr
- FROM bed_height_single WHERE river_id = ${river_id}
- </dc:statement>
- <dc:elements>
- <height>
- <dc:attribute name="factory" value="bedheight"/>
- <dc:attribute name="ids" value="bedheight-singlevalues-${bedh_id}-${bedh_year}"/>
- <dc:attribute name="description" value="${bedh_descr}"/>
- </height>
- </dc:elements>
- </dc:context>
- </soundings_width>
+ <dc:call-macro name="sounding-width"/>
</dc:if>
<dc:comment comment="--- non-recommendations---"/>
</dc:otherwise>
@@ -844,7 +937,7 @@
</discharge_table_nn>
</dc:if>
- <dc:if test="dc:contains($artifact-outs, 'floodmap')">
+ <dc:if test="dc:contains($artifact-outs, 'floodmap') or dc:contains($artifact-outs, 'floodmap-hws')">
<floodmap>
<dc:choose>
<dc:when test="dc:contains($parameters, 'recommended')">
@@ -858,6 +951,13 @@
</dc:otherwise>
</dc:choose>
</floodmap>
+ <dc:if test="dc:contains($parameters, 'hws')">
+ <hws>
+ <dc:call-macro name="flood-map-hws-lines" />
+ <dc:call-macro name="flood-map-hws-points" />
+ </hws>
+ </dc:if>
+
<dc:macro name="flood-map-recommended">
<dc:comment>
FIXME: Following two macros look identical to me.
@@ -879,12 +979,15 @@
<dems>
<dc:context>
<dc:statement>
- SELECT id AS dem_id,
- lower AS dem_lower,
- upper AS dem_upper,
- name AS name,
- projection || ' | ' || year_from || ' - ' || year_to AS info
- FROM dem WHERE river_id = ${river_id}
+ SELECT d.id AS dem_id,
+ r.a AS dem_lower,
+ r.b AS dem_upper,
+ d.name AS name,
+ d.projection || ' | ' || t.start_time || ' - ' || t.stop_time AS info
+ FROM dem d
+ JOIN ranges r ON d.range_id = r.id
+ JOIN time_intervals t ON d.time_interval_id = t.id
+ WHERE d.river_id = ${river_id}
</dc:statement>
<dc:elements>
<dem>
@@ -897,6 +1000,138 @@
</dc:context>
</dems>
</dc:macro>
+ <dc:macro name="flood-map-hws-lines">
+ <dc:context>
+ <dc:statement>
+ SELECT DISTINCT
+ name AS hws_name,
+ official AS hws_official,
+ kind_id AS hws_kind
+ FROM hws_lines
+ WHERE river_id = ${river_id}
+ </dc:statement>
+ <lines>
+ <official>
+ <Durchlass>
+ <dc:elements filter="$hws_kind=1 and $hws_official=1">
+ <hws>
+ <dc:attribute name="factory" value="hwsfactory"/>
+ <dc:attribute name="name" value="${hws_name}"/>
+ </hws>
+ </dc:elements>
+ </Durchlass>
+ <Damm>
+ <dc:elements filter="$hws_kind=2 and $hws_official=1">
+ <hws>
+ <dc:attribute name="factory" value="hwsfactory"/>
+ <dc:attribute name="name" value="${hws_name}"/>
+ </hws>
+ </dc:elements>
+ </Damm>
+ <Graben>
+ <dc:elements filter="$hws_kind=3 and $hws_official=1">
+ <hws>
+ <dc:attribute name="factory" value="hwsfactory"/>
+ <dc:attribute name="name" value="${hws_name}"/>
+ </hws>
+ </dc:elements>
+ </Graben>
+ </official>
+ <inofficial>
+ <Durchlass>
+ <dc:elements filter="$hws_kind=1 and $hws_official=0">
+ <hws>
+ <dc:attribute name="factory" value="hwsfactory"/>
+ <dc:attribute name="name" value="${hws_name}"/>
+ </hws>
+ </dc:elements>
+ </Durchlass>
+ <Damm>
+ <dc:elements filter="$hws_kind=2 and $hws_official=0">
+ <hws>
+ <dc:attribute name="factory" value="hwsfactory"/>
+ <dc:attribute name="name" value="${hws_name}"/>
+ </hws>
+ </dc:elements>
+ </Damm>
+ <Graben>
+ <dc:elements filter="$hws_kind=3 and $hws_official=0">
+ <hws>
+ <dc:attribute name="factory" value="hwsfactory"/>
+ <dc:attribute name="name" value="${hws_name}"/>
+ </hws>
+ </dc:elements>
+ </Graben>
+ </inofficial>
+ </lines>
+ </dc:context>
+ </dc:macro>
+ <dc:macro name="flood-map-hws-points">
+ <dc:context>
+ <dc:statement>
+ SELECT DISTINCT
+ name AS hws_points_name,
+ official AS hws_points_official,
+ kind_id AS hws_points_kind
+ FROM hws_points
+ WHERE river_id = ${river_id}
+ </dc:statement>
+ <points>
+ <official>
+ <Durchlass>
+ <dc:elements filter="$hws_points_kind=1 and $hws_points_official=1">
+ <hws>
+ <dc:attribute name="factory" value="hwsfactory"/>
+ <dc:attribute name="name" value="${hws_points_name}"/>
+ </hws>
+ </dc:elements>
+ </Durchlass>
+ <Damm>
+ <dc:elements filter="$hws_points_kind=2 and $hws_points_official=1">
+ <hws>
+ <dc:attribute name="factory" value="hwsfactory"/>
+ <dc:attribute name="name" value="${hws_points_name}"/>
+ </hws>
+ </dc:elements>
+ </Damm>
+ <Graben>
+                <dc:elements filter="$hws_points_kind=3 and $hws_points_official=1">
+ <hws>
+ <dc:attribute name="factory" value="hwsfactory"/>
+ <dc:attribute name="name" value="${hws_points_name}"/>
+ </hws>
+ </dc:elements>
+ </Graben>
+ </official>
+ <inofficial>
+ <Durchlass>
+ <dc:elements filter="$hws_points_kind=1 and $hws_points_official=0">
+ <hws>
+ <dc:attribute name="factory" value="hwsfactory"/>
+ <dc:attribute name="name" value="${hws_points_name}"/>
+ </hws>
+ </dc:elements>
+ </Durchlass>
+ <Damm>
+ <dc:elements filter="$hws_points_kind=2 and $hws_points_official=0">
+ <hws>
+ <dc:attribute name="factory" value="hwsfactory"/>
+ <dc:attribute name="name" value="${hws_points_name}"/>
+ </hws>
+ </dc:elements>
+ </Damm>
+ <Graben>
+ <dc:elements filter="$hws_points_kind=3 and $hws_points_official=0">
+ <hws>
+ <dc:attribute name="factory" value="hwsfactory"/>
+ <dc:attribute name="name" value="${hws_points_name}"/>
+ </hws>
+ </dc:elements>
+ </Graben>
+ </inofficial>
+ </points>
+ </dc:context>
+ </dc:macro>
<dc:macro name="flood-map-km">
<dc:context>
<dc:statement>
@@ -929,23 +1164,6 @@
</dc:elements>
</dc:context>
</dc:macro>
- <dc:macro name="flood-map-hws">
- <dc:context>
- <dc:statement>
- SELECT count(*) as km_exists, name as name
- FROM hws WHERE river_id = ${river_id} GROUP BY name
- </dc:statement>
- <dc:elements>
- <dc:if test="$km_exists>0">
- <hws>
- <dc:attribute name="factory" value="wmshwsfactory"/>
- <dc:attribute name="ids" value="${river_id};${name}"/>
- <dc:attribute name="name" value="${name}"/>
- </hws>
- </dc:if>
- </dc:elements>
- </dc:context>
- </dc:macro>
<dc:macro name="flood-map-hydr-boundaries">
<hydr_boundaries_lines>
<dc:call-macro name="flood-map-hydr-boundaries-lines"/>
@@ -1026,23 +1244,6 @@
</dc:context>
</land>
</dc:macro>
- <dc:macro name="flood-map-catchments">
- <dc:context>
- <dc:statement>
- SELECT count(*) as km_exists, name as name
- FROM catchment WHERE river_id = ${river_id} GROUP BY name
- </dc:statement>
- <dc:elements>
- <dc:if test="$km_exists>0">
- <catchment>
- <dc:attribute name="factory" value="wmscatchmentfactory"/>
- <dc:attribute name="ids" value="${river_id};${name}"/>
- <dc:attribute name="name" value="${name}"/>
- </catchment>
- </dc:if>
- </dc:elements>
- </dc:context>
- </dc:macro>
<dc:macro name="flood-map-floodplain">
<dc:context>
<dc:statement>
@@ -1059,16 +1260,33 @@
</dc:elements>
</dc:context>
</dc:macro>
- <dc:macro name="flood-map-lines">
+ <dc:macro name="hwslines">
<dc:context>
<dc:statement>
SELECT count(*) as km_exists, name as name
- FROM lines WHERE river_id = ${river_id} GROUP BY name
+ FROM hws_lines WHERE river_id = ${river_id} GROUP BY name
</dc:statement>
<dc:elements>
<dc:if test="$km_exists>0">
<line>
- <dc:attribute name="factory" value="wmslinefactory"/>
+ <dc:attribute name="factory" value="wmshwslinesfactory"/>
+ <dc:attribute name="ids" value="${river_id};${name}"/>
+ <dc:attribute name="name" value="${name}"/>
+ </line>
+ </dc:if>
+ </dc:elements>
+ </dc:context>
+ </dc:macro>
+ <dc:macro name="hwspoints">
+ <dc:context>
+ <dc:statement>
+ SELECT count(*) as km_exists, name as name
+ FROM hws_points WHERE river_id = ${river_id} GROUP BY name
+ </dc:statement>
+ <dc:elements>
+ <dc:if test="$km_exists>0">
+ <line>
+ <dc:attribute name="factory" value="wmshwspointsfactory"/>
<dc:attribute name="ids" value="${river_id};${name}"/>
<dc:attribute name="name" value="${name}"/>
</line>
@@ -1223,9 +1441,6 @@
<fixpoints>
<dc:call-macro name="flood-map-fixpoints"/>
</fixpoints>
- <hws>
- <dc:call-macro name="flood-map-hws"/>
- </hws>
<hydrboundaries>
<dc:call-macro name="flood-map-hydr-boundaries"/>
<dc:call-macro name="flood-map-floodplain"/>
@@ -1238,9 +1453,14 @@
<dc:call-macro name="flood-map-km"/>
<dc:call-macro name="flood-map-qps"/>
</kilometrage>
- <lines>
- <dc:call-macro name="flood-map-lines"/>
- </lines>
+ <hws>
+ <hws_lines>
+ <dc:call-macro name="hwslines"/>
+ </hws_lines>
+ <hws_points>
+ <dc:call-macro name="hwspoints"/>
+ </hws_points>
+ </hws>
<dc:call-macro name="flood-map-uesk"/>
<gaugelocations>
<dc:call-macro name="flood-map-gaugelocations"/>
@@ -1318,18 +1538,45 @@
<dc:when test="dc:contains($parameters, 'user-id')">
- <old_calculations>
- <dc:context connection="user">
- <dc:comment>
- Get the user and collection-id.
- </dc:comment>
- <dc:statement>
+ <old_calculations>
+ <!-- <dc:macro name="load-user">-->
+ <dc:call-macro name="user-range">
+ <dc:context connection="user">
+ <dc:comment>
+ Get the user and collection-id.
+ </dc:comment>
+ <dc:statement>
SELECT u.id AS user_id, c.id AS collection_id, c.name as collection_name
FROM collections c JOIN users u ON c.user_id = u.id
WHERE u.gid = CAST(${user-id} AS uuid)
ORDER BY c.creation DESC
</dc:statement>
+
+ <dc:macro name="range-filter">
+ <dc:statement>
+ SELECT m.id AS a_id,
+ m.state AS a_state,
+ m.gid AS a_gid,
+ m.creation AS a_creation,
+ COALESCE(ld_mode, '') AS ld_m,
+ COALESCE(ld_locations, '') AS ld_l,
+ COALESCE(ld_from, '') AS ld_f,
+ COALESCE(ld_to, '') AS ld_t
+ FROM master_artifacts_range m
+ WHERE m.collection_id = ${collection_id} AND m.gid <> CAST(${artifact-id} AS uuid)
+ AND EXISTS (
+ SELECT id FROM artifact_data ad WHERE ad.artifact_id = m.id AND k = 'river' AND v = ${river})
+ </dc:statement>
+ <dc:elements>
+ <dc:variable name="from" type="number" expr="dc:fromValue($ld_m, $ld_l, $ld_f)"/>
+ <dc:variable name="to" type="number" expr="dc:toValue($ld_m, $ld_l, $ld_t)"/>
+ <dc:if test="($from >= $fromkm and $from <= $tokm) or ($to <= $tokm and $to >= $fromkm) or ($from <= $fromkm and $to >= $tokm)">
+ <dc:macro-body/>
+ </dc:if>
+ </dc:elements>
+ </dc:macro>
+
<!-- OFFICIAL LINES -->
<dc:if test="dc:contains($artifact-outs, 'longitudinal_section')">
<dc:comment comment=".wst -------------------------------"/>
@@ -1337,11 +1584,27 @@
<dc:elements>
<dc:context>
<dc:statement>
- SELECT m.id AS a_id, m.state AS a_state, m.gid AS a_gid, m.creation AS a_creation, ardg.v as gaugy, arv.v as wqsingle
- FROM master_artifacts m, artifact_data ardg, artifact_data arv
- WHERE m.collection_id = ${collection_id} AND m.gid = CAST(${artifact-id} AS uuid) AND ardg.artifact_id = m.id AND ardg.k = 'ld_gaugename' AND arv.artifact_id = m.id AND arv.k = 'wq_single'
- AND EXISTS (
- SELECT id FROM artifact_data ad WHERE ad.artifact_id = m.id AND k = 'river' AND v = ${river})
+ SELECT m.id AS a_id,
+ m.state AS a_state,
+ m.gid AS a_gid,
+ m.creation AS a_creation,
+ ardg.v AS gaugy,
+ arv.v AS wqsingle
+ FROM master_artifacts m,
+ artifact_data ardg,
+ artifact_data arv
+ WHERE m.collection_id = ${collection_id}
+ AND m.gid = CAST(${artifact-id} AS uuid)
+ AND ardg.artifact_id = m.id
+ AND ardg.k = 'ld_gaugename'
+ AND arv.artifact_id = m.id
+ AND arv.k = 'wq_single'
+ AND EXISTS (
+ SELECT id
+ FROM artifact_data ad
+ WHERE ad.artifact_id = m.id
+ AND k = 'river'
+ AND v = ${river})
</dc:statement>
<dc:elements>
<dc:context connection="system">
@@ -1369,18 +1632,11 @@
SHOW W-DIFFERENCES
</dc:comment>
- <dc:if test="dc:contains($artifact-outs, 'longitudinal_section') or (dc:contains($artifact-outs, 'w_differences') or (dc:contains($artifact-outs, 'discharge_longitudinal_section')))">
+ <dc:macro name="differences">
<differences>
<dc:elements>
<dc:context>
- <dc:statement>
- SELECT m.id AS a_id, m.state AS a_state, m.gid AS a_gid, m.creation AS a_creation
- FROM master_artifacts m
- WHERE m.collection_id = ${collection_id} AND m.gid <> CAST(${artifact-id} AS uuid)
- AND EXISTS (
- SELECT id FROM artifact_data ad WHERE ad.artifact_id = m.id AND k = 'river' AND v = ${river})
- </dc:statement>
- <dc:elements>
+ <dc:call-macro name="range-filter">
<dc:context>
<dc:statement>
SELECT a.gid as aid, f.id AS fid, f.name AS facet_name, f.num AS facet_num, f.description as facet_description
@@ -1397,28 +1653,22 @@
</dc:element>
</dc:elements>
</dc:context>
- </dc:elements>
+ </dc:call-macro>
</dc:context>
</dc:elements>
</differences>
- </dc:if>
+ </dc:macro>
<dc:comment>
SHOW REFERENCE CURVE
</dc:comment>
- <dc:if test="dc:contains($artifact-outs, 'reference_curve')">
+
+ <dc:macro name="reference-curves">
<reference_curves>
<dc:elements>
<dc:context>
- <dc:statement>
- SELECT m.id AS a_id, m.state AS a_state, m.gid AS a_gid, m.creation AS a_creation
- FROM master_artifacts m
- WHERE m.collection_id = ${collection_id} AND m.gid <> CAST(${artifact-id} AS uuid)
- AND EXISTS (
- SELECT id FROM artifact_data ad WHERE ad.artifact_id = m.id AND k = 'river' AND v = ${river})
- </dc:statement>
- <dc:elements>
+ <dc:call-macro name="user-range">
<dc:context>
<dc:statement>
SELECT a.gid as aid, f.id AS fid, f.name AS facet_name, f.num AS facet_num, f.description as facet_description
@@ -1435,28 +1685,21 @@
</dc:element>
</dc:elements>
</dc:context>
- </dc:elements>
+ </dc:call-macro>
</dc:context>
</dc:elements>
</reference_curves>
- </dc:if>
+ </dc:macro>
<dc:comment>
SHOW COMPUTED DISCHARGE CURVES
</dc:comment>
- <dc:if test="dc:contains($artifact-outs, 'computed_discharge_curve')">
+ <dc:macro name="computed-discharge-curve">
<computed_discharge_curves>
<dc:elements>
<dc:context>
- <dc:statement>
- SELECT m.id AS a_id, m.state AS a_state, m.gid AS a_gid, m.creation AS a_creation
- FROM master_artifacts m
- WHERE m.collection_id = ${collection_id} AND m.gid <> CAST(${artifact-id} AS uuid)
- AND EXISTS (
- SELECT id FROM artifact_data ad WHERE ad.artifact_id = m.id AND k = 'river' AND v = ${river})
- </dc:statement>
- <dc:elements>
+ <dc:call-macro name="range-filter">
<dc:context>
<dc:statement>
SELECT a.gid as aid, f.id AS fid, f.name AS facet_name, f.num AS facet_num, f.description as facet_description
@@ -1473,43 +1716,37 @@
</dc:element>
</dc:elements>
</dc:context>
- </dc:elements>
+ </dc:call-macro>
</dc:context>
</dc:elements>
</computed_discharge_curves>
- </dc:if>
+ </dc:macro>
<dc:comment>
CROSS SECTION
</dc:comment>
- <dc:if test="dc:contains($artifact-outs, 'cross_section')">
+
+ <dc:macro name="waterlevels">
<waterlevels>
<dc:elements>
<dc:context>
- <dc:statement>
- SELECT m.id AS a_id, m.state AS a_state, m.gid AS a_gid, m.creation AS a_creation
- FROM master_artifacts m
- WHERE m.collection_id = ${collection_id} AND m.gid <> CAST(${artifact-id} AS uuid)
- AND EXISTS (
- SELECT id FROM artifact_data ad WHERE ad.artifact_id = m.id AND k = 'river' AND v = ${river})
- </dc:statement>
- <dc:elements>
- <dc:context>
- <dc:statement>
- SELECT id AS out_id
- FROM outs
- WHERE artifact_id = ${a_id} AND name = 'cross_section'
- </dc:statement>
- <dc:elements>
- <dc:context>
- <dc:statement>
- SELECT name AS facet_name, num as facet_num, description AS facet_description
- FROM facets
- WHERE out_id = ${out_id}
- ORDER BY num ASC, name DESC
- </dc:statement>
- <longitudinal_section_columns>
+ <dc:call-macro name="range-filter">
+ <dc:context>
+ <dc:statement>
+ SELECT id AS out_id
+ FROM outs
+ WHERE artifact_id = ${a_id} AND name = 'cross_section'
+ </dc:statement>
+ <dc:elements>
+ <dc:context>
+ <dc:statement>
+ SELECT name AS facet_name, num as facet_num, description AS facet_description
+ FROM facets
+ WHERE out_id = ${out_id}
+ ORDER BY num ASC, name DESC
+ </dc:statement>
+ <longitudinal_section_columns>
<dc:attribute name="description" value="${river} ${a_creation}"/>
<dc:elements>
<dc:element name="${facet_name}">
@@ -1520,30 +1757,23 @@
<dc:attribute name="out" value="cross_section"/>
</dc:element>
</dc:elements>
- </longitudinal_section_columns>
- </dc:context>
- </dc:elements>
- </dc:context>
- </dc:elements>
+ </longitudinal_section_columns>
+ </dc:context>
+ </dc:elements>
+ </dc:context>
+ </dc:call-macro>
</dc:context>
</dc:elements>
</waterlevels>
- </dc:if>
+ </dc:macro>
- <dc:if test="dc:contains($artifact-outs, 'longitudinal_section') or (dc:contains($artifact-outs, 'discharge_longitudinal_section') or (dc:contains($artifact-outs, 'w_differences')))">
+ <dc:macro name="longitudinal">
<waterlevels>
<dc:elements>
<dc:context>
- <dc:statement>
- SELECT m.id AS a_id, m.state AS a_state, m.gid AS a_gid, m.creation AS a_creation
- FROM master_artifacts m
- WHERE m.collection_id = ${collection_id} AND m.gid <> CAST(${artifact-id} AS uuid)
- AND EXISTS (
- SELECT id FROM artifact_data ad WHERE ad.artifact_id = m.id AND k = 'river' AND v = ${river})
- </dc:statement>
- <dc:elements>
- <dc:context>
+ <dc:call-macro name="range-filter">
+ <dc:context>
<dc:statement>
SELECT id AS out_id
FROM outs
@@ -1572,25 +1802,18 @@
</dc:context>
</dc:elements>
</dc:context>
- </dc:elements>
+ </dc:call-macro>
</dc:context>
</dc:elements>
</waterlevels>
- </dc:if>
+ </dc:macro>
- <dc:if test="dc:contains($artifact-outs, 'fix_longitudinal_section_curve')">
+
+ <dc:macro name="longitudinal-section">
<waterlevels>
<dc:elements>
<dc:context>
- <dc:statement>
- SELECT m.id AS a_id, m.state AS a_state, m.gid AS a_gid, m.creation AS a_creation
- FROM master_artifacts m
- WHERE m.collection_id = ${collection_id} AND m.gid <> CAST(${artifact-id} AS uuid)
- AND EXISTS (
- SELECT id FROM artifact_data ad WHERE ad.artifact_id = m.id AND k = 'river' AND v = ${river})
- </dc:statement>
- <dc:elements>
-
+ <dc:call-macro name="range-filter">
<dc:context>
<dc:statement>
SELECT id AS out_id
@@ -1621,25 +1844,17 @@
</dc:context>
</dc:elements>
</dc:context>
- </dc:elements>
+ </dc:call-macro>
</dc:context>
</dc:elements>
</waterlevels>
- </dc:if>
+ </dc:macro>
- <dc:if test="dc:contains($artifact-outs, 'fix_deltawt_curve')">
+ <dc:macro name="delta-wt">
<waterlevels>
<dc:elements>
<dc:context>
- <dc:statement>
- SELECT m.id AS a_id, m.state AS a_state, m.gid AS a_gid, m.creation AS a_creation
- FROM master_artifacts m
- WHERE m.collection_id = ${collection_id} AND m.gid <> CAST(${artifact-id} AS uuid)
- AND EXISTS (
- SELECT id FROM artifact_data ad WHERE ad.artifact_id = m.id AND k = 'river' AND v = ${river})
- </dc:statement>
- <dc:elements>
-
+ <dc:call-macro name="range-filter">
<dc:context>
<dc:statement>
SELECT id AS out_id
@@ -1669,25 +1884,18 @@
</dc:context>
</dc:elements>
</dc:context>
- </dc:elements>
+ </dc:call-macro>
</dc:context>
</dc:elements>
</waterlevels>
- </dc:if>
+ </dc:macro>
- <dc:if test="dc:contains($artifact-outs, 'fix_derivate_curve')">
+
+ <dc:macro name="fix-derivate-curve">
<waterlevels>
<dc:elements>
<dc:context>
- <dc:statement>
- SELECT m.id AS a_id, m.state AS a_state, m.gid AS a_gid, m.creation AS a_creation
- FROM master_artifacts m
- WHERE m.collection_id = ${collection_id} AND m.gid <> CAST(${artifact-id} AS uuid)
- AND EXISTS (
- SELECT id FROM artifact_data ad WHERE ad.artifact_id = m.id AND k = 'river' AND v = ${river})
- </dc:statement>
- <dc:elements>
-
+ <dc:call-macro name="range-filter">
<dc:context>
<dc:statement>
SELECT id AS out_id
@@ -1717,25 +1925,18 @@
</dc:context>
</dc:elements>
</dc:context>
- </dc:elements>
+ </dc:call-macro>
</dc:context>
</dc:elements>
</waterlevels>
- </dc:if>
+ </dc:macro>
- <dc:if test="dc:contains($artifact-outs, 'fix_wq_curve')">
+
+ <dc:macro name="fix-wq-curve">
<waterlevels>
<dc:elements>
<dc:context>
- <dc:statement>
- SELECT m.id AS a_id, m.state AS a_state, m.gid AS a_gid, m.creation AS a_creation
- FROM master_artifacts m
- WHERE m.collection_id = ${collection_id} AND m.gid <> CAST(${artifact-id} AS uuid)
- AND EXISTS (
- SELECT id FROM artifact_data ad WHERE ad.artifact_id = m.id AND k = 'river' AND v = ${river})
- </dc:statement>
- <dc:elements>
-
+ <dc:call-macro name="range-filter">
<dc:context>
<dc:statement>
SELECT id AS out_id
@@ -1765,24 +1966,18 @@
</dc:context>
</dc:elements>
</dc:context>
- </dc:elements>
+ </dc:call-macro>
</dc:context>
</dc:elements>
</waterlevels>
- </dc:if>
+ </dc:macro>
- <dc:if test="dc:contains($artifact-outs, 'duration_curve')">
+
+ <dc:macro name="duration-curve">
<computed_discharge_curves>
<dc:elements>
<dc:context>
- <dc:statement>
- SELECT m.id AS a_id, m.state AS a_state, m.gid AS a_gid, m.creation AS a_creation
- FROM master_artifacts m
- WHERE m.collection_id = ${collection_id} AND m.gid <> CAST(${artifact-id} AS uuid)
- AND EXISTS (
- SELECT id FROM artifact_data ad WHERE ad.artifact_id = m.id AND k = 'river' AND v = ${river})
- </dc:statement>
- <dc:elements>
+ <dc:call-macro name="range-filter">
<dc:context>
<dc:statement>
SELECT a.gid as aid, f.id AS fid, f.name AS facet_name, f.num AS facet_num, f.description as facet_description
@@ -1799,29 +1994,23 @@
</dc:element>
</dc:elements>
</dc:context>
- </dc:elements>
+ </dc:call-macro>
</dc:context>
</dc:elements>
</computed_discharge_curves>
- </dc:if>
+ </dc:macro>
+
<dc:comment>
WATERLEVELS - ONLY SHOW Ws
</dc:comment>
<!-- TODO doesnt work nicely for fix/wq-diags. -->
- <dc:if test="dc:contains($artifact-outs, 'waterlevels') or (dc:contains($artifact-outs, 'fix_wq_curve'))">
+
+ <dc:macro name="waterlevels-fix">
<waterlevels>
<dc:elements>
<dc:context>
- <dc:statement>
- SELECT m.id AS a_id, m.state AS a_state, m.gid AS a_gid, m.creation AS a_creation
- FROM master_artifacts m
- WHERE m.collection_id = ${collection_id} AND m.gid <> CAST(${artifact-id} AS uuid)
- AND EXISTS (
- SELECT id FROM artifact_data ad WHERE ad.artifact_id = m.id AND k = 'river' AND v = ${river})
- </dc:statement>
- <dc:elements>
-
+ <dc:call-macro name="range-filter">
<dc:context>
<dc:statement>
SELECT id AS out_id
@@ -1851,28 +2040,22 @@
</dc:context>
</dc:elements>
</dc:context>
- </dc:elements>
+ </dc:call-macro>
</dc:context>
</dc:elements>
</waterlevels>
- </dc:if>
+ </dc:macro>
<dc:comment>
SHOW FLOODMAPS
</dc:comment>
- <dc:if test="dc:contains($artifact-outs, 'floodmap') or dc:contains($artifact-outs, 'map')">
+
+ <dc:macro name="flood-map">
<floodmap>
<dc:elements>
<dc:context>
- <dc:statement>
- SELECT m.id AS a_id, m.state AS a_state, m.gid AS a_gid, m.creation AS a_creation
- FROM master_artifacts m
- WHERE m.collection_id = ${collection_id} AND m.gid <> CAST(${artifact-id} AS uuid)
- AND EXISTS (
- SELECT id FROM artifact_data ad WHERE ad.artifact_id = m.id AND k = 'river' AND v = ${river})
- </dc:statement>
- <dc:elements>
+ <dc:call-macro name="range-filter">
<dc:context>
<dc:statement>
SELECT a.gid as aid, f.id AS fid, f.name AS facet_name, f.num AS facet_num, f.description as facet_description
@@ -1889,28 +2072,21 @@
</dc:element>
</dc:elements>
</dc:context>
- </dc:elements>
+ </dc:call-macro>
</dc:context>
</dc:elements>
</floodmap>
- </dc:if>
+ </dc:macro>
<dc:comment>
MINFO bedheight difference
</dc:comment>
- <dc:if test="dc:contains($artifact-outs, 'bed_difference_year') or dc:contains($artifact-outs, 'bed_difference_height_year')">
+ <dc:macro name="bed-difference">
<fix_longitudinal_section_curve>
<dc:elements>
<dc:context>
- <dc:statement>
- SELECT m.id AS a_id, m.state AS a_state, m.gid AS a_gid, m.creation AS a_creation
- FROM master_artifacts m
- WHERE m.collection_id = ${collection_id} AND m.gid <> CAST(${artifact-id} AS uuid)
- AND EXISTS (
- SELECT id FROM artifact_data ad WHERE ad.artifact_id = m.id AND k = 'river' AND v = ${river})
- </dc:statement>
- <dc:elements>
+ <dc:call-macro name="range-filter">
<dc:context>
<dc:statement>
SELECT a.gid as aid, f.id AS fid, f.name AS facet_name, f.num AS facet_num, f.description as facet_description
@@ -1932,28 +2108,21 @@
</dc:elements>
</fix_longitudinal_section_curve>
</dc:context>
- </dc:elements>
+ </dc:call-macro>
</dc:context>
</dc:elements>
</fix_longitudinal_section_curve>
- </dc:if>
+ </dc:macro>
<dc:comment>
MINFO bedheight middle
</dc:comment>
- <dc:if test="dc:contains($artifact-outs, 'bedheight_middle')">
+ <dc:macro name="bed-height">
<fix_vollmer_wq_curve>
<dc:elements>
<dc:context>
- <dc:statement>
- SELECT m.id AS a_id, m.state AS a_state, m.gid AS a_gid, m.creation AS a_creation
- FROM master_artifacts m
- WHERE m.collection_id = ${collection_id} AND m.gid <> CAST(${artifact-id} AS uuid)
- AND EXISTS (
- SELECT id FROM artifact_data ad WHERE ad.artifact_id = m.id AND k = 'river' AND v = ${river})
- </dc:statement>
- <dc:elements>
+ <dc:call-macro name="range-filter">
<dc:context>
<dc:statement>
SELECT a.gid as aid, f.id AS fid, f.name AS facet_name, f.num AS facet_num, f.description as facet_description
@@ -1974,13 +2143,90 @@
</dc:elements>
</fix_vollmer_wq_curve>
</dc:context>
- </dc:elements>
+ </dc:call-macro>
</dc:context>
</dc:elements>
</fix_vollmer_wq_curve>
+ </dc:macro>
+
+ <dc:macro name="floodmap-hws-user">
+ <dc:context>
+ <dc:statement>
+ SELECT id AS out_id
+ FROM outs
+ WHERE artifact_id = ${a_id} AND name = 'floodmap'
+ </dc:statement>
+ <dc:elements>
+ <dc:context>
+ <dc:statement>
+ SELECT name AS facet_name, num as facet_num, description AS facet_description
+ FROM facets
+ WHERE out_id = ${out_id} and name = 'floodmap.usershape'
+ ORDER BY num ASC, name DESC
+ </dc:statement>
+ <own-hws>
+ <dc:elements>
+ <dc:element name="${facet_name}">
+ <dc:attribute name="description" value="${facet_description}"/>
+ <dc:attribute name="ids" value="${facet_num}"/>
+ <dc:attribute name="factory" value="winfo"/>
+ <dc:attribute name="artifact-id" value="${a_gid}"/>
+ <dc:attribute name="out" value="floodmap"/>
+ </dc:element>
+ </dc:elements>
+ </own-hws>
+ </dc:context>
+ </dc:elements>
+ </dc:context>
+ </dc:macro>
+ <dc:if test="dc:contains($artifact-outs, 'longitudinal_section') or (dc:contains($artifact-outs, 'discharge_longitudinal_section') or (dc:contains($artifact-outs, 'w_differences')))">
+ <dc:call-macro name="longitudinal"/>
</dc:if>
+ <dc:if test="dc:contains($artifact-outs, 'fix_deltawt_curve')">
+ <dc:call-macro name="delta-wt"/>
+ </dc:if>
+ <dc:if test="dc:contains($artifact-outs, 'longitudinal_section') or (dc:contains($artifact-outs, 'w_differences') or (dc:contains($artifact-outs, 'discharge_longitudinal_section')))">
+ <dc:call-macro name="differences"/>
+ </dc:if>
+ <dc:if test="dc:contains($artifact-outs, 'reference_curve')">
+ <dc:call-macro name="reference-curves"/>
+ </dc:if>
+ <dc:if test="dc:contains($artifact-outs, 'computed_discharge_curve')">
+ <dc:call-macro name="computed-discharge-curve"/>
+ </dc:if>
+ <dc:if test="dc:contains($artifact-outs, 'cross_section')">
+ <dc:call-macro name="waterlevels"/>
+ </dc:if>
+ <dc:if test="dc:contains($artifact-outs, 'fix_longitudinal_section_curve')">
+ <dc:call-macro name="longitudinal-section"/>
+ </dc:if>
+ <dc:if test="dc:contains($artifact-outs, 'fix_derivate_curve')">
+ <dc:call-macro name="fix-derivate-curve"/>
+ </dc:if>
+ <dc:if test="dc:contains($artifact-outs, 'fix_wq_curve')">
+ <dc:call-macro name="fix-wq-curve"/>
+ </dc:if>
+ <dc:if test="dc:contains($artifact-outs, 'duration_curve')">
+ <dc:call-macro name="duration-curve"/>
+ </dc:if>
+ <dc:if test="dc:contains($artifact-outs, 'waterlevels') or (dc:contains($artifact-outs, 'fix_wq_curve'))">
+ <dc:call-macro name="waterlevels-fix"/>
+ </dc:if>
+ <dc:if test="dc:contains($artifact-outs, 'floodmap') or dc:contains($artifact-outs, 'map')">
+ <dc:call-macro name="flood-map"/>
+ </dc:if>
+ <dc:if test="dc:contains($artifact-outs, 'bed_difference_year') or dc:contains($artifact-outs, 'bed_difference_height_year')">
+ <dc:call-macro name="bed-difference"/>
+ </dc:if>
+ <dc:if test="dc:contains($artifact-outs, 'bedheight_middle')">
+ <dc:call-macro name="bed-height"/>
+ </dc:if>
+ <dc:if test="dc:contains($artifact-outs, 'floodmap-hws')">
+ <dc:call-macro name="floodmap-hws-user"/>
+ </dc:if>
+ </dc:context>
+ </dc:call-macro>
- </dc:context>
</old_calculations>
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/conf/rivermap.xml
--- a/flys-artifacts/doc/conf/rivermap.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/doc/conf/rivermap.xml Fri Mar 22 11:25:54 2013 +0100
@@ -27,4 +27,104 @@
<river-wms url="http://example.com/cgi-bin/river-wms" layers="Elbe"/>
<background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
</river>
+ <river name="Donau">
+ <srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Donau"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Donaurna">
+ <srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Donaurna"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="DonauSK">
+ <srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="DonauSK"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Fulda">
+ <srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Fulda"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Fulda-Sommer">
+ <srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Fulda-Sommer"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Lahn">
+ <srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Lahn"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Main">
+ <srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Main"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Main-Wehrarm-Limbach">
+ <srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Main-Wehrarm-Limbach"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Main-Wehrarm-Volkach">
+ <srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Main-Wehrarm-Volkach"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Neckar">
+ <srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Neckar"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Neckar-über-Wehrarme">
+ <srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Neckar-über-Wehrarme"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Rhein">
+ <srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Rhein"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Saale">
+ <srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Saale"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Saale-Thüringen">
+ <srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Saale-Thüringen"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Saar-Wilt-Bogen">
+ <srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Saar-Wilt-Bogen"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Werra">
+ <srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Werra"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Werra-Sommer">
+ <srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Werra-Sommer"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Weser">
+ <srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Weser"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Oder">
+ <srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Oder"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
+ <river name="Havel">
+ <srid value="31467"/>
+ <river-wms url="http://example.com/cgi-bin/river-wms" layers="Havel"/>
+ <background-wms url="http://osm.intevation.de/mapcache/?" layers="OSM-WMS-Dienst"/>
+ </river>
</rivermap>
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/conf/seddb-db.xml
--- a/flys-artifacts/doc/conf/seddb-db.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/doc/conf/seddb-db.xml Fri Mar 22 11:25:54 2013 +0100
@@ -6,4 +6,7 @@
<dialect>org.hibernate.dialect.PostgreSQLDialect</dialect>
<driver>org.postgresql.Driver</driver>
<url>jdbc:postgresql://localhost:5432/seddb</url>
+ <!--
+ <connection-init-sqls>ALTER SESSION SET CURRENT_SCHEMA=SEDDB</connection-init-sqls>
+ -->
</seddb-database>
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/conf/themes.xml
--- a/flys-artifacts/doc/conf/themes.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/doc/conf/themes.xml Fri Mar 22 11:25:54 2013 +0100
@@ -173,6 +173,8 @@
<mapping from="longitudinal_section.q" pattern="(NQ)(\D.*)*"
to="LongitudinalSectionQ_NQ" />
<mapping from="longitudinal_section.q" to="LongitudinalSection" />
+ <mapping from="discharge_longitudinal_section.q.infolding" to="LongitudinalSectionQInfolding" />
+ <mapping from="discharge_longitudinal_section.q.cutting" to="LongitudinalSectionQInfoldCut" />
<mapping from="discharge_curve.curve" to="DischargeCurve" />
<mapping from="historical_discharge.historicalq" to="HistoricalDischargeCurveQ" />
@@ -201,7 +203,8 @@
<mapping from="floodmap.riveraxis" to="RiverAxis" />
<mapping from="floodmap.kms" to="Kms" />
<mapping from="floodmap.qps" to="Qps" />
- <mapping from="floodmap.hws" to="Hws" />
+ <mapping from="floodmap.hws_lines" to="Hws" />
+ <mapping from="floodmap.hws_points" to="HwsPoints" />
<mapping from="floodmap.hydr_boundaries" to="HydrBoundariesLines" />
<mapping from="floodmap.hydr_boundaries_poly" to="HydrBoundariesPolys" />
<mapping from="floodmap.catchment" to="Catchment" />
@@ -219,9 +222,9 @@
<mapping from="other.wqkms.q" to="WQKms" />
<mapping from="heightmarks_points" to="heightmarks_points" />
<mapping from="area" to="Area" />
- <mapping from="cross_section.area" to="Area" />
+ <mapping from="cross_section.area" to="CrossSectionArea" />
<mapping from="hyk" to="Hyk" />
- <mapping from="longitudinal_section.area" to="Area" />
+ <mapping from="longitudinal_section.area" to="LongitudinalSectionArea" />
<mapping from="longitudinal_section.manualpoints" to="ManualPoints" />
<mapping from="cross_section.manualpoints" to="ManualPoints" />
<mapping from="cross_section.manualline" to="CrossSectionWaterLine" />
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/conf/themes/default/cross-section.xml
--- a/flys-artifacts/doc/conf/themes/default/cross-section.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/doc/conf/themes/default/cross-section.xml Fri Mar 22 11:25:54 2013 +0100
@@ -48,4 +48,22 @@
default="true" />
</fields>
</theme>
+
+ <theme name="CrossSectionArea">
+ <inherits>
+ <inherit from="Areas" />
+ </inherits>
+ <fields>
+ <field name="areashowbg" type="boolean" display="Hintergrund anzeigen"
+ default="true" hints="hidden" />
+ <field name="areashowborder" type="boolean" display="Begrenzung"
+ default="false" hints="hidden" />
+ <field name="areabordercolor" type="Color" default="0, 0, 0"
+ display="Begrenzungslinienfarbe" hints="hidden" />
+ <field name="showarea" type="boolean" display="Flaeche anzeigen"
+ default="true" hints="hidden" />
+ <field name="showarealabel" type="boolean"
+ display="Flächenbeschriftung anzeigen" default="false" hints="hidden" />
+ </fields>
+ </theme>
<!--/themegroup-->
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/conf/themes/default/floodmap.xml
--- a/flys-artifacts/doc/conf/themes/default/floodmap.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/doc/conf/themes/default/floodmap.xml Fri Mar 22 11:25:54 2013 +0100
@@ -80,6 +80,14 @@
</inherits>
</theme>
+ <theme name="HwsPoints">
+ <inherits>
+ <inherit from="MapLines"/>
+ <inherit from="Label" />
+ <inherit from="Symbol" />
+ </inherits>
+ </theme>
+
<theme name="Catchment">
<inherits>
<inherit from="Map" />
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/conf/themes/default/general.xml
--- a/flys-artifacts/doc/conf/themes/default/general.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/doc/conf/themes/default/general.xml Fri Mar 22 11:25:54 2013 +0100
@@ -124,5 +124,22 @@
</fields>
</theme>
+ <theme name="Area">
+ <inherits>
+ <inherit from="Areas" />
+ </inherits>
+ <fields>
+ <field name="areashowbg" type="boolean" display="Hintergrund anzeigen"
+ default="true" hints="hidden" />
+ <field name="areashowborder" type="boolean" display="Begrenzung"
+ default="false" hints="hidden" />
+ <field name="areabordercolor" type="Color" default="0, 0, 0"
+ display="Begrenzungslinienfarbe" hints="hidden" />
+ <field name="showarea" type="boolean" display="Flaeche anzeigen"
+ default="true" hints="hidden" />
+ <field name="showarealabel" type="boolean"
+ display="Flächenbeschriftung anzeigen" default="false" hints="hidden" />
+ </fields>
+ </theme>
<!--/themegroup-->
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/conf/themes/default/longitudinal-section.xml
--- a/flys-artifacts/doc/conf/themes/default/longitudinal-section.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/doc/conf/themes/default/longitudinal-section.xml Fri Mar 22 11:25:54 2013 +0100
@@ -656,6 +656,26 @@
</fields>
</theme>
+ <theme name="LongitudinalSectionQInfoldCut">
+ <inherits>
+ <inherit from="LongitudinalSection" />
+ </inherits>
+ <fields>
+ <field name="linecolor" type="Color" display="Linienfarbe"
+ default="102, 102, 102" />
+ </fields>
+ </theme>
+
+ <theme name="LongitudinalSectionQInfolding">
+ <inherits>
+ <inherit from="LongitudinalSection" />
+ </inherits>
+ <fields>
+ <field name="linecolor" type="Color" display="Linienfarbe"
+ default="51, 51, 51" />
+ </fields>
+ </theme>
+
<!-- MIDDLE BED HEIGHT -->
<theme name="MiddleBedHeightSingle">
<inherits>
@@ -925,5 +945,21 @@
</fields>
</theme>
-
+ <theme name="LongitudinalSectionArea">
+ <inherits>
+ <inherit from="Areas" />
+ </inherits>
+ <fields>
+ <field name="areashowbg" type="boolean" display="Hintergrund anzeigen"
+ default="true" hints="hidden" />
+ <field name="areashowborder" type="boolean" display="Begrenzung"
+ default="false" hints="hidden" />
+ <field name="areabordercolor" type="Color" default="0, 0, 0"
+ display="Begrenzungslinienfarbe" hints="hidden" />
+ <field name="showarea" type="boolean" display="Flaeche anzeigen"
+ default="true" hints="hidden" />
+ <field name="showarealabel" type="boolean"
+ display="Flächenbeschriftung anzeigen" default="false" hints="hidden" />
+ </fields>
+ </theme>
<!--/themegroup>-->
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/conf/themes/second/cross-section.xml
--- a/flys-artifacts/doc/conf/themes/second/cross-section.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/doc/conf/themes/second/cross-section.xml Fri Mar 22 11:25:54 2013 +0100
@@ -48,4 +48,22 @@
default="true" />
</fields>
</theme>
+
+ <theme name="CrossSectionArea">
+ <inherits>
+ <inherit from="Areas" />
+ </inherits>
+ <fields>
+ <field name="areashowbg" type="boolean" display="Hintergrund anzeigen"
+ default="true" hints="hidden" />
+ <field name="areashowborder" type="boolean" display="Begrenzung"
+ default="false" hints="hidden" />
+ <field name="areabordercolor" type="Color" default="0, 0, 0"
+ display="Begrenzungslinienfarbe" hints="hidden" />
+ <field name="showarea" type="boolean" display="Flaeche anzeigen"
+ default="true" hints="hidden" />
+ <field name="showarealabel" type="boolean"
+ display="Flächenbeschriftung anzeigen" default="false" hints="hidden" />
+ </fields>
+ </theme>
<!--/themegroup-->
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/conf/themes/second/floodmap.xml
--- a/flys-artifacts/doc/conf/themes/second/floodmap.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/doc/conf/themes/second/floodmap.xml Fri Mar 22 11:25:54 2013 +0100
@@ -80,6 +80,14 @@
</inherits>
</theme>
+ <theme name="HwsPoints">
+ <inherits>
+ <inherit from="MapLines"/>
+ <inherit from="Label" />
+ <inherit from="Symbol" />
+ </inherits>
+ </theme>
+
<theme name="Catchment">
<inherits>
<inherit from="Map" />
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/conf/themes/second/general.xml
--- a/flys-artifacts/doc/conf/themes/second/general.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/doc/conf/themes/second/general.xml Fri Mar 22 11:25:54 2013 +0100
@@ -124,5 +124,22 @@
</fields>
</theme>
+ <theme name="Area">
+ <inherits>
+ <inherit from="Areas" />
+ </inherits>
+ <fields>
+ <field name="areashowbg" type="boolean" display="Hintergrund anzeigen"
+ default="true" hints="hidden" />
+ <field name="areashowborder" type="boolean" display="Begrenzung"
+ default="false" hints="hidden" />
+ <field name="areabordercolor" type="Color" default="0, 0, 0"
+ display="Begrenzungslinienfarbe" hints="hidden" />
+ <field name="showarea" type="boolean" display="Flaeche anzeigen"
+ default="true" hints="hidden" />
+ <field name="showarealabel" type="boolean"
+ display="Flächenbeschriftung anzeigen" default="false" hints="hidden" />
+ </fields>
+ </theme>
<!--/themegroup-->
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/conf/themes/second/longitudinal-section.xml
--- a/flys-artifacts/doc/conf/themes/second/longitudinal-section.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/doc/conf/themes/second/longitudinal-section.xml Fri Mar 22 11:25:54 2013 +0100
@@ -925,5 +925,21 @@
</fields>
</theme>
-
+ <theme name="LongitudinalSectionArea">
+ <inherits>
+ <inherit from="Areas" />
+ </inherits>
+ <fields>
+ <field name="areashowbg" type="boolean" display="Hintergrund anzeigen"
+ default="true" hints="hidden" />
+ <field name="areashowborder" type="boolean" display="Begrenzung"
+ default="false" hints="hidden" />
+ <field name="areabordercolor" type="Color" default="0, 0, 0"
+ display="Begrenzungslinienfarbe" hints="hidden" />
+ <field name="showarea" type="boolean" display="Flaeche anzeigen"
+ default="true" hints="hidden" />
+ <field name="showarealabel" type="boolean"
+ display="Flächenbeschriftung anzeigen" default="false" hints="hidden" />
+ </fields>
+ </theme>
<!--/themegroup>-->
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/doc/howto_wmsartifact.txt
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-artifacts/doc/howto_wmsartifact.txt Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,21 @@
+How to add a WMS artifact for the maps:
+
+- Add an artifact similar to the existing ones in:
+ flys-artifacts/src/main/java/de/intevation/flys/artifacts/
+- Define the facet name in:
+ flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/FacetTypes.java
+- Modify the datacage configuration (documentation is in flys-artifacts/doc):
+ flys-artifacts/doc/conf/meta-data.xml
+- Define the factory used in the datacage configuration in:
+ flys-artifacts/doc/conf/conf.xml
+- Define the facet for the map:
+ flys-artifacts/doc/conf/artifacts/map.xml
+- You might also want to add it to the floodmap in winfo:
+ flys-artifacts/doc/conf/artifacts/winfo.xml
+- Add translations for the datacage elements in:
+ flys-client/src/main/java/de/intevation/flys/client/client/FLYSConstants.java
+- English localization:
+ flys-client/src/main/java/de/intevation/flys/client/client/FLYSConstants.properties
+- German localization:
+ flys-client/src/main/java/de/intevation/flys/client/client/FLYSConstants_de.properties
+
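As a hedged illustration of the facet-name step above: a facet type is just a string constant that must match the `from` attribute of a `<mapping>` in themes.xml (the mapping entries are taken from this patch; the constant name and its value are assumptions mirroring `FLOODMAP_HWS_LINES` as used later in this patch, the real constants live in FacetTypes.java):

```java
// Minimal sketch: a facet type constant and the themes.xml mapping it must match.
// Constant name/value are assumed; actual definitions live in FacetTypes.java.
public class FacetNameSketch {
    // Facet type as returned by e.g. HWSLinesState.getFacetType()
    public static final String FLOODMAP_HWS_LINES = "floodmap.hws_lines";

    public static void main(String[] args) {
        // themes.xml resolves a facet name to a theme name, e.g.
        // <mapping from="floodmap.hws_lines" to="Hws" />
        java.util.Map<String, String> mappings = new java.util.HashMap<>();
        mappings.put("floodmap.hws_lines", "Hws");
        mappings.put("floodmap.hws_points", "HwsPoints");
        System.out.println(mappings.get(FLOODMAP_HWS_LINES)); // prints "Hws"
    }
}
```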
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/CollectionMonitor.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/CollectionMonitor.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/CollectionMonitor.java Fri Mar 22 11:25:54 2013 +0100
@@ -22,10 +22,12 @@
import de.intevation.flys.artifacts.datacage.Recommendations;
+/** Monitors collection changes. */
public class CollectionMonitor implements Hook {
public static final String XPATH_RESULT = "/art:result";
+
@Override
public void setup(Node cfg) {
}
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/MainValuesArtifact.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/MainValuesArtifact.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/MainValuesArtifact.java Fri Mar 22 11:25:54 2013 +0100
@@ -175,14 +175,15 @@
logger.debug("MainValuesArtifact.initialize");
FLYSArtifact winfo = (FLYSArtifact) artifact;
RangeAccess rangeAccess = new RangeAccess(winfo, null);
- double [] locations = rangeAccess.getLocations();
+ double [] locations = rangeAccess.getKmRange();
+
if (locations != null) {
double location = locations[0];
addData("ld_locations", new DefaultStateData("ld_locations", null, null,
String.valueOf(location)));
}
else {
- logger.warn("No location for mainvalues given.");
+ logger.error("No location for mainvalues given.");
}
importData(winfo, "river");
}
@@ -247,7 +248,13 @@
// TODO use helper to get location as double
String locationStr = getDataAsString("ld_locations");
- if (river == null || locationStr == null) {
+ if (river == null) {
+ logger.error("River is null");
+ return null;
+ }
+
+ if (locationStr == null) {
+ logger.error("Location string is null");
return null;
}
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/MapArtifact.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/MapArtifact.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/MapArtifact.java Fri Mar 22 11:25:54 2013 +0100
@@ -158,7 +158,7 @@
getID(), hash,
getUrl());
- String name = type + "-" + artifact.identifier();
+ String name = artifact.getDataAsString("river");
facet.addLayer(name);
facet.setExtent(getExtent(false));
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/RiverAxisArtifact.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/RiverAxisArtifact.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/RiverAxisArtifact.java Fri Mar 22 11:25:54 2013 +0100
@@ -145,7 +145,7 @@
@Override
protected String getDataString() {
if (FLYSUtils.isUsingOracle()) {
- return "geom FROM river_axes";
+ return "geom FROM river_axes USING SRID " + getSrid();
}
else {
return "geom FROM river_axes USING UNIQUE id";
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/WMSCatchmentArtifact.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/WMSCatchmentArtifact.java Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,166 +0,0 @@
-package de.intevation.flys.artifacts;
-
-import java.util.List;
-
-import org.w3c.dom.Document;
-
-import org.apache.log4j.Logger;
-
-import com.vividsolutions.jts.geom.Envelope;
-
-import de.intevation.artifacts.ArtifactFactory;
-import de.intevation.artifacts.CallMeta;
-
-import de.intevation.artifactdatabase.state.DefaultOutput;
-import de.intevation.artifactdatabase.state.Facet;
-import de.intevation.artifactdatabase.state.State;
-
-import de.intevation.flys.model.Catchment;
-import de.intevation.flys.model.River;
-
-import de.intevation.flys.artifacts.model.FacetTypes;
-import de.intevation.flys.artifacts.model.RiverFactory;
-import de.intevation.flys.utils.FLYSUtils;
-import de.intevation.flys.utils.GeometryUtils;
-
-
-public class WMSCatchmentArtifact extends WMSDBArtifact {
-
- public static final String NAME = "catchment";
-
-
- private static final Logger logger =
- Logger.getLogger(WMSCatchmentArtifact.class);
-
-
- @Override
- public void setup(
- String identifier,
- ArtifactFactory factory,
- Object context,
- CallMeta callMeta,
- Document data)
- {
- logger.debug("WMSCatchmentArtifact.setup");
-
- super.setup(identifier, factory, context, callMeta, data);
- }
-
-
- @Override
- public String getName() {
- return NAME;
- }
-
-
- @Override
- public State getCurrentState(Object cc) {
- State s = new CatchmentState(this);
-
- List<Facet> fs = getFacets(getCurrentStateId());
-
- DefaultOutput o = new DefaultOutput(
- "floodmap",
- "floodmap",
- "image/png",
- fs,
- "map");
-
- s.getOutputs().add(o);
-
- return s;
- }
-
-
- public static class CatchmentState extends WMSDBState implements FacetTypes
- {
- private static final Logger logger =
- Logger.getLogger(CatchmentState.class);
-
- protected int riverId;
-
- public CatchmentState(WMSDBArtifact artifact) {
- super(artifact);
- riverId = 0;
- }
-
- public int getRiverId() {
- if (riverId == 0) {
- String ids = artifact.getDataAsString("ids");
- String[] parts = ids.split(";");
-
- try {
- riverId = Integer.parseInt(parts[0]);
- }
- catch (NumberFormatException nfe) {
- logger.error("Cannot parse river id from '" + ids + "'");
- }
- }
-
- return riverId;
- }
-
- @Override
- protected String getFacetType() {
- return FLOODMAP_CATCHMENT;
- }
-
- @Override
- protected String getUrl() {
- return FLYSUtils.getUserWMSUrl(artifact.identifier());
- }
-
- @Override
- protected String getSrid() {
- River river = RiverFactory.getRiver(getRiverId());
- return FLYSUtils.getRiverSrid(river.getName());
- }
-
- @Override
- protected Envelope getExtent(boolean reproject) {
- List<Catchment> catchments =
- Catchment.getCatchments(getRiverId(), getName());
-
- Envelope max = null;
-
- for (Catchment c: catchments) {
- Envelope env = c.getGeom().getEnvelopeInternal();
-
- if (max == null) {
- max = env;
- continue;
- }
-
- max.expandToInclude(env);
- }
-
- return max != null && reproject
- ? GeometryUtils.transform(max, getSrid())
- : max;
- }
-
- @Override
- protected String getFilter() {
- return "river_id=" + String.valueOf(getRiverId())
- + " AND name='" + getName() + "'";
- }
-
- @Override
- protected String getDataString() {
- String srid = getSrid();
-
- if (FLYSUtils.isUsingOracle()) {
- return "geom FROM catchment USING SRID " + srid;
- }
- else {
- return "geom FROM catchment USING UNIQUE id USING SRID " + srid;
- }
- }
-
- @Override
- protected String getGeometryType() {
- return "POLYGON";
- }
- } // end of WMSKmState
-}
-// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/WMSHWSLinesArtifact.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/WMSHWSLinesArtifact.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,165 @@
+package de.intevation.flys.artifacts;
+
+import java.util.List;
+
+import org.w3c.dom.Document;
+
+import org.apache.log4j.Logger;
+
+import com.vividsolutions.jts.geom.Envelope;
+
+import de.intevation.artifacts.ArtifactFactory;
+import de.intevation.artifacts.CallMeta;
+
+import de.intevation.artifactdatabase.state.DefaultOutput;
+import de.intevation.artifactdatabase.state.Facet;
+import de.intevation.artifactdatabase.state.State;
+
+import de.intevation.flys.model.HWSLine;
+import de.intevation.flys.model.River;
+
+import de.intevation.flys.artifacts.model.FacetTypes;
+import de.intevation.flys.artifacts.model.RiverFactory;
+import de.intevation.flys.utils.FLYSUtils;
+import de.intevation.flys.utils.GeometryUtils;
+
+
+public class WMSHWSLinesArtifact extends WMSDBArtifact {
+
+ public static final String NAME = "hws_lines";
+
+
+ private static final Logger logger =
+ Logger.getLogger(WMSHWSLinesArtifact.class);
+
+
+ @Override
+ public void setup(
+ String identifier,
+ ArtifactFactory factory,
+ Object context,
+ CallMeta callMeta,
+ Document data)
+ {
+ logger.debug("WMSHWSLinesArtifact.setup");
+
+ super.setup(identifier, factory, context, callMeta, data);
+ }
+
+
+ @Override
+ public String getName() {
+ return NAME;
+ }
+
+
+ @Override
+ public State getCurrentState(Object cc) {
+ State s = new HWSLinesState(this);
+
+ List<Facet> fs = getFacets(getCurrentStateId());
+
+ DefaultOutput o = new DefaultOutput(
+ "floodmap",
+ "floodmap",
+ "image/png",
+ fs,
+ "map");
+
+ s.getOutputs().add(o);
+
+ return s;
+ }
+
+
+ public static class HWSLinesState extends WMSDBState implements FacetTypes
+ {
+ private static final Logger logger =
+ Logger.getLogger(HWSLinesState.class);
+
+ protected int riverId;
+
+ public HWSLinesState(WMSDBArtifact artifact) {
+ super(artifact);
+ riverId = 0;
+ }
+
+ public int getRiverId() {
+ if (riverId == 0) {
+ String ids = artifact.getDataAsString("ids");
+ String[] parts = ids.split(";");
+
+ try {
+ riverId = Integer.parseInt(parts[0]);
+ }
+ catch (NumberFormatException nfe) {
+ logger.error("Cannot parse river id from '" + parts[0] + "'");
+ }
+ }
+
+ return riverId;
+ }
+
+ @Override
+ protected String getFacetType() {
+ return FLOODMAP_HWS_LINES;
+ }
+
+ @Override
+ protected String getUrl() {
+ return FLYSUtils.getUserWMSUrl(artifact.identifier());
+ }
+
+ @Override
+ protected String getSrid() {
+ River river = RiverFactory.getRiver(getRiverId());
+ return FLYSUtils.getRiverSrid(river.getName());
+ }
+
+ @Override
+ protected Envelope getExtent(boolean reproject) {
+ List<HWSLine> hws = HWSLine.getLines(getRiverId(), getName());
+
+ Envelope max = null;
+
+ for (HWSLine h: hws) {
+ Envelope env = h.getGeom().getEnvelopeInternal();
+
+ if (max == null) {
+ max = env;
+ continue;
+ }
+
+ max.expandToInclude(env);
+ }
+
+ return max != null && reproject
+ ? GeometryUtils.transform(max, getSrid())
+ : max;
+ }
+
+ @Override
+ protected String getFilter() {
+ return "river_id=" + String.valueOf(getRiverId())
+ + " AND name='" + getName() + "'";
+ }
+
+ @Override
+ protected String getDataString() {
+ String srid = getSrid();
+
+ if (FLYSUtils.isUsingOracle()) {
+ return "geom FROM hws_lines USING SRID " + srid;
+ }
+ else {
+ return "geom FROM hws_lines USING UNIQUE id USING SRID " + srid;
+ }
+ }
+
+ @Override
+ protected String getGeometryType() {
+ return "LINE";
+ }
+ }
+}
+// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
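The `getRiverId()` logic above (shared by the HWS artifact states) can be exercised standalone. This is a sketch with the artifact plumbing stubbed out; only the parsing itself is reproduced:

```java
// Standalone sketch of HWSLinesState.getRiverId(): the "ids" datum is a
// ';'-separated string whose first field is the numeric river id.
public class RiverIdParseSketch {
    static int parseRiverId(String ids) {
        String[] parts = ids.split(";");
        try {
            return Integer.parseInt(parts[0]);
        }
        catch (NumberFormatException nfe) {
            // The artifact logs this case and leaves riverId at 0.
            return 0;
        }
    }

    public static void main(String[] args) {
        System.out.println(parseRiverId("7;extra")); // prints 7
        System.out.println(parseRiverId("bogus"));   // prints 0
    }
}
```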
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/WMSHWSPointsArtifact.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/WMSHWSPointsArtifact.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,165 @@
+package de.intevation.flys.artifacts;
+
+import java.util.List;
+
+import org.w3c.dom.Document;
+
+import org.apache.log4j.Logger;
+
+import com.vividsolutions.jts.geom.Envelope;
+
+import de.intevation.artifacts.ArtifactFactory;
+import de.intevation.artifacts.CallMeta;
+
+import de.intevation.artifactdatabase.state.DefaultOutput;
+import de.intevation.artifactdatabase.state.Facet;
+import de.intevation.artifactdatabase.state.State;
+
+import de.intevation.flys.model.HWSPoint;
+import de.intevation.flys.model.River;
+
+import de.intevation.flys.artifacts.model.FacetTypes;
+import de.intevation.flys.artifacts.model.RiverFactory;
+import de.intevation.flys.utils.FLYSUtils;
+import de.intevation.flys.utils.GeometryUtils;
+
+
+public class WMSHWSPointsArtifact extends WMSDBArtifact {
+
+ public static final String NAME = "hws_points";
+
+
+ private static final Logger logger =
+ Logger.getLogger(WMSHWSPointsArtifact.class);
+
+
+ @Override
+ public void setup(
+ String identifier,
+ ArtifactFactory factory,
+ Object context,
+ CallMeta callMeta,
+ Document data)
+ {
+ logger.debug("WMSHWSPointsArtifact.setup");
+
+ super.setup(identifier, factory, context, callMeta, data);
+ }
+
+
+ @Override
+ public String getName() {
+ return NAME;
+ }
+
+
+ @Override
+ public State getCurrentState(Object cc) {
+ State s = new HWSPointsState(this);
+
+ List<Facet> fs = getFacets(getCurrentStateId());
+
+ DefaultOutput o = new DefaultOutput(
+ "floodmap",
+ "floodmap",
+ "image/png",
+ fs,
+ "map");
+
+ s.getOutputs().add(o);
+
+ return s;
+ }
+
+
+ public static class HWSPointsState extends WMSDBState implements FacetTypes
+ {
+ private static final Logger logger =
+ Logger.getLogger(HWSPointsState.class);
+
+ protected int riverId;
+
+ public HWSPointsState(WMSDBArtifact artifact) {
+ super(artifact);
+ riverId = 0;
+ }
+
+ public int getRiverId() {
+ if (riverId == 0) {
+ String ids = artifact.getDataAsString("ids");
+ String[] parts = ids.split(";");
+
+ try {
+ riverId = Integer.parseInt(parts[0]);
+ }
+ catch (NumberFormatException nfe) {
+ logger.error("Cannot parse river id from '" + parts[0] + "'");
+ }
+ }
+
+ return riverId;
+ }
+
+ @Override
+ protected String getFacetType() {
+ return FLOODMAP_HWS_POINTS;
+ }
+
+ @Override
+ protected String getUrl() {
+ return FLYSUtils.getUserWMSUrl(artifact.identifier());
+ }
+
+ @Override
+ protected String getSrid() {
+ River river = RiverFactory.getRiver(getRiverId());
+ return FLYSUtils.getRiverSrid(river.getName());
+ }
+
+ @Override
+ protected Envelope getExtent(boolean reproject) {
+ List<HWSPoint> hws = HWSPoint.getPoints(getRiverId(), getName());
+
+ Envelope max = null;
+
+ for (HWSPoint h: hws) {
+ Envelope env = h.getGeom().getEnvelopeInternal();
+
+ if (max == null) {
+ max = env;
+ continue;
+ }
+
+ max.expandToInclude(env);
+ }
+
+ return max != null && reproject
+ ? GeometryUtils.transform(max, getSrid())
+ : max;
+ }
+
+ @Override
+ protected String getFilter() {
+ return "river_id=" + String.valueOf(getRiverId())
+ + " AND name='" + getName() + "'";
+ }
+
+ @Override
+ protected String getDataString() {
+ String srid = getSrid();
+
+ if (FLYSUtils.isUsingOracle()) {
+ return "geom FROM hws_points USING SRID " + srid;
+ }
+ else {
+ return "geom FROM hws_points USING UNIQUE id USING SRID " + srid;
+ }
+ }
+
+ @Override
+ protected String getGeometryType() {
+ return "POINT";
+ }
+ }
+}
+// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
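The extent computation in `getExtent()` above unions bounding boxes by expanding a running maximum envelope. A standalone sketch of that loop, with JTS's `Envelope` replaced by a minimal stand-in so it runs without the library:

```java
// Sketch of the extent union in HWSPointsState.getExtent(). The Env class
// is a minimal stand-in for JTS's Envelope (assumption: only the
// expandToInclude semantics used here are reproduced).
public class ExtentUnionSketch {
    static class Env {
        double minX, minY, maxX, maxY;
        Env(double minX, double minY, double maxX, double maxY) {
            this.minX = minX; this.minY = minY;
            this.maxX = maxX; this.maxY = maxY;
        }
        void expandToInclude(Env o) {
            minX = Math.min(minX, o.minX);
            minY = Math.min(minY, o.minY);
            maxX = Math.max(maxX, o.maxX);
            maxY = Math.max(maxY, o.maxY);
        }
    }

    static Env union(java.util.List<Env> envs) {
        Env max = null;
        for (Env e: envs) {
            if (max == null) {
                max = e;
                continue;
            }
            max.expandToInclude(e);
        }
        return max; // null for an empty list, as in the artifact
    }

    public static void main(String[] args) {
        Env u = union(java.util.Arrays.asList(
            new Env(0, 0, 2, 2), new Env(1, 1, 5, 3)));
        System.out.println(u.minX + "," + u.minY + "," + u.maxX + "," + u.maxY);
        // prints 0.0,0.0,5.0,3.0
    }
}
```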
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/WMSHwsArtifact.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/WMSHwsArtifact.java Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,165 +0,0 @@
-package de.intevation.flys.artifacts;
-
-import java.util.List;
-
-import org.w3c.dom.Document;
-
-import org.apache.log4j.Logger;
-
-import com.vividsolutions.jts.geom.Envelope;
-
-import de.intevation.artifacts.ArtifactFactory;
-import de.intevation.artifacts.CallMeta;
-
-import de.intevation.artifactdatabase.state.DefaultOutput;
-import de.intevation.artifactdatabase.state.Facet;
-import de.intevation.artifactdatabase.state.State;
-
-import de.intevation.flys.model.River;
-import de.intevation.flys.model.Hws;
-
-import de.intevation.flys.artifacts.model.FacetTypes;
-import de.intevation.flys.artifacts.model.RiverFactory;
-import de.intevation.flys.utils.FLYSUtils;
-import de.intevation.flys.utils.GeometryUtils;
-
-
-public class WMSHwsArtifact extends WMSDBArtifact {
-
- public static final String NAME = "hws";
-
-
- private static final Logger logger =
- Logger.getLogger(WMSHwsArtifact.class);
-
-
- @Override
- public void setup(
- String identifier,
- ArtifactFactory factory,
- Object context,
- CallMeta callMeta,
- Document data)
- {
- logger.debug("WMSHwsArtifact.setup");
-
- super.setup(identifier, factory, context, callMeta, data);
- }
-
-
- @Override
- public String getName() {
- return NAME;
- }
-
-
- @Override
- public State getCurrentState(Object cc) {
- State s = new HwsState(this);
-
- List<Facet> fs = getFacets(getCurrentStateId());
-
- DefaultOutput o = new DefaultOutput(
- "floodmap",
- "floodmap",
- "image/png",
- fs,
- "map");
-
- s.getOutputs().add(o);
-
- return s;
- }
-
-
- public static class HwsState extends WMSDBState implements FacetTypes
- {
- private static final Logger logger =
- Logger.getLogger(HwsState.class);
-
- protected int riverId;
-
- public HwsState(WMSDBArtifact artifact) {
- super(artifact);
- riverId = 0;
- }
-
- public int getRiverId() {
- if (riverId == 0) {
- String ids = artifact.getDataAsString("ids");
- String[] parts = ids.split(";");
-
- try {
- riverId = Integer.parseInt(parts[0]);
- }
- catch (NumberFormatException nfe) {
- logger.error("Cannot parse river id from '" + parts[0] + "'");
- }
- }
-
- return riverId;
- }
-
- @Override
- protected String getFacetType() {
- return FLOODMAP_HWS;
- }
-
- @Override
- protected String getUrl() {
- return FLYSUtils.getUserWMSUrl(artifact.identifier());
- }
-
- @Override
- protected String getSrid() {
- River river = RiverFactory.getRiver(getRiverId());
- return FLYSUtils.getRiverSrid(river.getName());
- }
-
- @Override
- protected Envelope getExtent(boolean reproject) {
- List<Hws> hws = Hws.getHws(getRiverId(), getName());
-
- Envelope max = null;
-
- for (Hws h: hws) {
- Envelope env = h.getGeom().getEnvelopeInternal();
-
- if (max == null) {
- max = env;
- continue;
- }
-
- max.expandToInclude(env);
- }
-
- return max != null && reproject
- ? GeometryUtils.transform(max, getSrid())
- : max;
- }
-
- @Override
- protected String getFilter() {
- return "river_id=" + String.valueOf(getRiverId())
- + " AND name='" + getName() + "'";
- }
-
- @Override
- protected String getDataString() {
- String srid = getSrid();
-
- if (FLYSUtils.isUsingOracle()) {
- return "geom FROM hws USING SRID " + srid;
- }
- else {
- return "geom FROM hws USING UNIQUE id USING SRID " + srid;
- }
- }
-
- @Override
- protected String getGeometryType() {
- return "LINE";
- }
- } // end of WMSKmState
-}
-// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/WMSLineArtifact.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/WMSLineArtifact.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/WMSLineArtifact.java Fri Mar 22 11:25:54 2013 +0100
@@ -16,7 +16,7 @@
import de.intevation.artifactdatabase.state.State;
import de.intevation.flys.model.River;
-import de.intevation.flys.model.Line;
+import de.intevation.flys.model.HWSLine;
import de.intevation.flys.artifacts.model.FacetTypes;
import de.intevation.flys.artifacts.model.RiverFactory;
@@ -118,11 +118,11 @@
@Override
protected Envelope getExtent(boolean reproject) {
- List<Line> lines = Line.getLines(getRiverId(), getName());
+ List<HWSLine> lines = HWSLine.getLines(getRiverId(), getName());
Envelope max = null;
- for (Line l: lines) {
+ for (HWSLine l: lines) {
Envelope env = l.getGeom().getEnvelopeInternal();
if (max == null) {
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/access/FixAccess.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/access/FixAccess.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/access/FixAccess.java Fri Mar 22 11:25:54 2013 +0100
@@ -7,6 +7,7 @@
import org.apache.log4j.Logger;
+/** Access for Fixation related data. */
public class FixAccess
extends RangeAccess
{
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/access/FixRealizingAccess.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/access/FixRealizingAccess.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/access/FixRealizingAccess.java Fri Mar 22 11:25:54 2013 +0100
@@ -9,6 +9,8 @@
import org.apache.log4j.Logger;
+
+/** Fix-Realizing (Volmer/Ausgelagerte Wasserspiegellage) access. */
public class FixRealizingAccess
extends FixAccess
{
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/access/MapAccess.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/access/MapAccess.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,27 @@
+package de.intevation.flys.artifacts.access;
+
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.List;
+
+import de.intevation.artifacts.CallContext;
+import de.intevation.flys.artifacts.FLYSArtifact;
+
+
+public class MapAccess
+extends RangeAccess
+{
+
+ public MapAccess(FLYSArtifact artifact, CallContext context) {
+ super(artifact, context);
+ }
+
+ public List<String> getHWS() {
+ String param = getString("uesk.hws");
+ if (param != null) {
+ String[] split = param.split(";");
+ return new ArrayList<String>(Arrays.asList(split));
+ }
+ return new ArrayList<String>();
+ }
+}
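The `getHWS()` accessor above splits the `uesk.hws` datum on `;` and falls back to an empty list. A standalone sketch of that behavior (the parameter value `"dike1;wall2"` is an invented example, not taken from real data):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch of MapAccess.getHWS(): the "uesk.hws" datum is a ';'-separated
// list of HWS names; a missing datum yields an empty list.
public class HwsParamSketch {
    static List<String> getHWS(String param) {
        if (param != null) {
            String[] split = param.split(";");
            return new ArrayList<String>(Arrays.asList(split));
        }
        return new ArrayList<String>();
    }

    public static void main(String[] args) {
        System.out.println(getHWS("dike1;wall2")); // prints [dike1, wall2]
        System.out.println(getHWS(null));          // prints []
    }
}
```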
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/access/RangeAccess.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/access/RangeAccess.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/access/RangeAccess.java Fri Mar 22 11:25:54 2013 +0100
@@ -57,7 +57,7 @@
else {
mode = KM_MODE.NONE;
}
-
+
return mode;
}
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/context/FLYSContextFactory.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/context/FLYSContextFactory.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/context/FLYSContextFactory.java Fri Mar 22 11:25:54 2013 +0100
@@ -83,6 +83,9 @@
private static final String XPATH_ZOOM_SCALES = "/artifact-database/options/zoom-scales/zoom-scale";
+ private static final String XPATH_DGM_PATH = "/artifact-database/options/dgm-path/text()";
+
+
/**
* Creates a new FLYSArtifactContext object and initialize all
* components required by the application.
@@ -102,11 +105,22 @@
configureFloodmapWMS(config, context);
configureModules(config, context);
configureZoomScales(config, context);
+ configureDGMPath(config, context);
return context;
}
+ private void configureDGMPath(Document config, FLYSContext context) {
+ String dgmPath = (String) XMLUtils.xpath(
+ config,
+ XPATH_DGM_PATH,
+ XPathConstants.STRING);
+
+ context.put("dgm-path", dgmPath);
+ }
+
+
protected void configureZoomScales(Document config, FLYSContext context) {
NodeList list = (NodeList)XMLUtils.xpath(
config,
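configureDGMPath evaluates an XPath text() expression against the config document and stores the result in the context under a well-known key. A sketch of that lookup using plain JAXP in place of the project's XMLUtils helper (names are illustrative):

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

/** Sketch of the dgm-path lookup added to FLYSContextFactory. */
public class DgmPathSketch {

    /** Evaluate the same expression as XPATH_DGM_PATH; null on any parse error. */
    public static String readDgmPath(String xml) {
        try {
            Document config = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
            return (String) XPathFactory.newInstance().newXPath().evaluate(
                "/artifact-database/options/dgm-path/text()",
                config, XPathConstants.STRING);
        }
        catch (Exception e) {
            return null;
        }
    }

    public static void main(String[] args) {
        String xml = "<artifact-database><options>"
            + "<dgm-path>/opt/dgm</dgm-path></options></artifact-database>";
        System.out.println(readDgmPath(xml)); // /opt/dgm
    }
}
```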
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/datacage/templating/Builder.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/datacage/templating/Builder.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/datacage/templating/Builder.java Fri Mar 22 11:25:54 2013 +0100
@@ -236,6 +236,36 @@
}
}
+
+ protected ResultData createFilteredResultData(ResultData rd, String filter) {
+ if (filter == null) return rd;
+
+ List<Object []> rows = rd.getRows();
+ String [] columns = rd.getColumnLabels();
+
+ List<Object []> filtered = new ArrayList<Object[]>(rows.size());
+
+ for (Object [] row: rows) {
+ frames.enter();
+ try {
+ frames.put(columns, row);
+ boolean traverse = filter == null;
+
+ if (!traverse) {
+ Boolean b = evaluateXPathToBoolean(filter);
+ traverse = b != null && b;
+ }
+ if (traverse) {
+ filtered.add(row);
+ }
+ }
+ finally {
+ frames.leave();
+ }
+ }
+ return new ResultData(rd.getColumnLabels(), filtered);
+ }
+
/**
* Kind of foreach over results of a statement within a context.
*/
@@ -249,6 +279,12 @@
return;
}
+ String filter = current.getAttribute("filter");
+
+ if ((filter = filter.trim()).length() == 0) {
+ filter = null;
+ }
+
NodeList subs = current.getChildNodes();
int S = subs.getLength();
@@ -257,29 +293,45 @@
return;
}
+ Pair<Builder.NamedConnection, ResultData> pair =
+ connectionsStack.peek();
+
ResultData rd = connectionsStack.peek().getB();
+ ResultData orig = rd;
- String [] columns = rd.getColumnLabels();
+ if (filter != null) {
+ ResultData rdCopy = createFilteredResultData(rd, filter);
+ pair.setB(rdCopy);
+ rd = rdCopy;
+ }
+ try {
+ String [] columns = rd.getColumnLabels();
- //if (log.isDebugEnabled()) {
- // log.debug("pushing vars: "
- // + java.util.Arrays.toString(columns));
- //}
+ //if (log.isDebugEnabled()) {
+ // log.debug("pushing vars: "
+ // + java.util.Arrays.toString(columns));
+ //}
- for (Object [] row: rd.getRows()) {
- frames.enter();
- try {
- frames.put(columns, row);
- //if (log.isDebugEnabled()) {
- // log.debug("current vars: " + frames.dump());
- //}
- for (int i = 0; i < S; ++i) {
- build(parent, subs.item(i));
+ for (Object [] row: rd.getRows()) {
+ frames.enter();
+ try {
+ frames.put(columns, row);
+ //if (log.isDebugEnabled()) {
+ // log.debug("current vars: " + frames.dump());
+ //}
+ for (int i = 0; i < S; ++i) {
+ build(parent, subs.item(i));
+ }
+ }
+ finally {
+ frames.leave();
}
}
- finally {
- frames.leave();
- }
+ }
+ finally {
+ if (filter != null) {
+ pair.setB(orig);
+ }
}
}
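createFilteredResultData evaluates the datacage filter expression once per row, with the row's columns bound as variables, and keeps only the matching rows. A simplified stand-in where a plain predicate replaces the XPath evaluation over frames:

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch of the per-row filtering added to Builder. */
public class RowFilterSketch {

    /** Stand-in for evaluating the filter XPath against the current frames. */
    interface RowPredicate {
        boolean accept(String[] columns, Object[] row);
    }

    /** Keep rows accepted by the filter; a null filter keeps everything. */
    public static List<Object[]> filterRows(
        String[] columns, List<Object[]> rows, RowPredicate filter
    ) {
        if (filter == null) return rows;
        List<Object[]> filtered = new ArrayList<Object[]>(rows.size());
        for (Object[] row : rows) {
            if (filter.accept(columns, row)) {
                filtered.add(row);
            }
        }
        return filtered;
    }
}
```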
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/datacage/templating/FunctionResolver.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/datacage/templating/FunctionResolver.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/datacage/templating/FunctionResolver.java Fri Mar 22 11:25:54 2013 +0100
@@ -93,17 +93,17 @@
Object from = args.get(2);
if (!(mode instanceof String)){
- return -Double.MAX_VALUE;
+ return -99999d;
}
if (mode.equals("locations")) {
if (!(locations instanceof String)) {
- return -Double.MAX_VALUE;
+ return -99999d;
}
String loc = ((String)locations).replace(" ", "");
String[] split = loc.split(",");
if (split.length < 1) {
- return -Double.MAX_VALUE;
+ return -99999d;
}
try {
double min = Double.parseDouble(split[0]);
@@ -116,23 +116,23 @@
return min;
}
catch (NumberFormatException nfe) {
- return -Double.MAX_VALUE;
+ return -99999d;
}
}
else if (mode.equals("distance")) {
if (!(from instanceof String)) {
- return -Double.MAX_VALUE;
+ return -99999d;
}
String f = (String)from;
try {
return Double.parseDouble(f);
}
catch(NumberFormatException nfe) {
- return -Double.MAX_VALUE;
+ return -99999d;
}
}
else {
- return -Double.MAX_VALUE;
+ return -99999d;
}
}
});
@@ -149,18 +149,18 @@
Object to = args.get(2);
if (!(mode instanceof String)){
- return Double.MAX_VALUE;
+ return 99999d;
}
if (mode.equals("locations")) {
if (!(locations instanceof String)) {
- return Double.MAX_VALUE;
+ return 99999d;
}
try {
String loc = ((String)locations).replace(" ", "");
String[] split = loc.split(",");
if (split.length < 1) {
- return Double.MAX_VALUE;
+ return 99999d;
}
double max = Double.parseDouble(split[0]);
for (int i = 1; i < split.length; ++i) {
@@ -172,12 +172,12 @@
return max;
}
catch (NumberFormatException nfe) {
- return Double.MAX_VALUE;
+ return 99999d;
}
}
else if (mode.equals("distance")) {
if (!(to instanceof String)) {
- return Double.MAX_VALUE;
+ return 99999d;
}
else {
String t = (String)to;
@@ -185,12 +185,12 @@
return Double.parseDouble(t);
}
catch(NumberFormatException nfe) {
- return Double.MAX_VALUE;
+ return 99999d;
}
}
}
else {
- return Double.MAX_VALUE;
+ return 99999d;
}
}
});
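This hunk swaps the +/-Double.MAX_VALUE fallbacks for the sentinels +/-99999 throughout the min/max helper functions (presumably because the extreme literal misbehaves once it reaches downstream consumers; the patch itself does not say why). A sketch of the "locations" minimum branch with the new sentinel:

```java
/** Sketch of the FunctionResolver "locations" branch after the sentinel change. */
public class MinLocationSketch {

    public static final double MIN_SENTINEL = -99999d;

    /** Minimum of a comma-separated list of numbers; sentinel on any bad input. */
    public static double minLocation(Object locations) {
        if (!(locations instanceof String)) {
            return MIN_SENTINEL;
        }
        String loc = ((String) locations).replace(" ", "");
        String[] split = loc.split(",");
        if (split.length < 1) {
            return MIN_SENTINEL;
        }
        try {
            double min = Double.parseDouble(split[0]);
            for (int i = 1; i < split.length; ++i) {
                double v = Double.parseDouble(split[i]);
                if (v < min) min = v;
            }
            return min;
        }
        catch (NumberFormatException nfe) {
            return MIN_SENTINEL;
        }
    }
}
```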
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/datacage/templating/ResultData.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/datacage/templating/ResultData.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/datacage/templating/ResultData.java Fri Mar 22 11:25:54 2013 +0100
@@ -26,6 +26,11 @@
rows = new ArrayList<Object []>();
}
+ public ResultData(String [] columns, List<Object []> rows) {
+ this.columns = columns;
+ this.rows = rows;
+ }
+
public ResultData(ResultSetMetaData meta)
throws SQLException
{
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/Calculation4.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/Calculation4.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/Calculation4.java Fri Mar 22 11:25:54 2013 +0100
@@ -323,9 +323,7 @@
int numProblemsBefore = numProblems();
double [] qs = qf.findQs(kms, this);
- // TODO: i18n
- String name = "Umh\u00fcllende " + columns[i].getName();
-
+ String name = columns[i].getName();
ConstantWQKms infolding = new ConstantWQKms(kms, qs, ws, name);
if (numProblems() > numProblemsBefore) {
@@ -335,6 +333,19 @@
infoldings.add(infolding);
}
+ for (int i = 0; i < infoldings.size(); i++) {
+ String name = infoldings.get(i).getName();
+ // TODO: i18n
+ if (i == 0) {
+ infoldings.get(i).setName("untere Umh\u00fcllende " + name);
+ }
+ else if (i == infoldings.size() - 1) {
+ infoldings.get(i).setName("obere Umh\u00fcllende " + name);
+ }
+ else {
+ infoldings.get(i).setName("geschnitten " + name);
+ }
+ }
return infoldings.toArray(new ConstantWQKms[infoldings.size()]);
}
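The loop added to Calculation4 renames the infolding curves positionally: the first becomes the lower envelope, the last the upper envelope, and anything in between is marked as cut. A sketch of that pass (strings left German, matching the i18n TODO in the patch):

```java
import java.util.List;

/** Sketch of the positional renaming added to Calculation4. */
public class InfoldingNameSketch {

    public static String[] nameInfoldings(List<String> names) {
        String[] out = new String[names.size()];
        for (int i = 0; i < names.size(); i++) {
            String name = names.get(i);
            if (i == 0) {
                out[i] = "untere Umh\u00fcllende " + name;      // lower envelope
            }
            else if (i == names.size() - 1) {
                out[i] = "obere Umh\u00fcllende " + name;       // upper envelope
            }
            else {
                out[i] = "geschnitten " + name;                 // cut in between
            }
        }
        return out;
    }
}
```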
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/Calculation6.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/Calculation6.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/Calculation6.java Fri Mar 22 11:25:54 2013 +0100
@@ -13,6 +13,7 @@
/**
+ * Historical Discharge Calculation.
* @author <a href="mailto:ingo.weinzierl at intevation.de">Ingo Weinzierl</a>
*/
public class Calculation6 extends Calculation {
@@ -101,6 +102,7 @@
return relevant;
}
+ /** True if the timerange of the given discharge table overlaps the requested timerange. */
protected boolean isDischargeTableRelevant(DischargeTable dt) {
TimeInterval ti = dt.getTimeInterval();
@@ -173,6 +175,7 @@
String.valueOf(km), dt.getTimeInterval());
}
+ /** Without reference. */
protected WQTimerange[] prepareSimpleData(List<DischargeTable> dts) {
List<WQTimerange> wqts = new ArrayList<WQTimerange>(values.length);
@@ -220,6 +223,7 @@
return wqts.toArray(new WQTimerange[wqts.size()]);
}
+ /** With reference. */
protected HistoricalWQTimerange[] prepareData(DischargeTable refTable,
List<DischargeTable> dts) {
List<HistoricalWQTimerange> wqts = new ArrayList<HistoricalWQTimerange>(
@@ -295,6 +299,7 @@
.toArray(new HistoricalWQTimerange[wqts.size()]);
}
+ /** Returns discharge table interval as Date[]. */
protected Date[] prepareTimeInterval(DischargeTable dt) {
TimeInterval ti = dt.getTimeInterval();
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/DischargeTables.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/DischargeTables.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/DischargeTables.java Fri Mar 22 11:25:54 2013 +0100
@@ -26,10 +26,10 @@
/** Private logger. */
private static Logger log = Logger.getLogger(DischargeTables.class);
- /** Scale to convert discharge table values of master table into [cm] */
+ /** Scale to convert discharge table values of master table into [cm]. */
public static final double MASTER_SCALE = 100d;
- /** Scale to convert discharge table values of historical tables into [cm] */
+ /** Scale to convert discharge table values of historical tables into [cm]. */
public static final double HISTORICAL_SCALE = 1d;
public static final int MASTER = 0;
@@ -184,6 +184,11 @@
return x > a && x < b;
}
+ /**
+ * Find or interpolate Qs from q/w array.
+ * @param values [[q0,q1,q2],[w0,w1,w2]]
+ * @param w W value to look for in values.
+ */
public static double [] getQsForW(double [][] values, double w) {
boolean debug = log.isDebugEnabled();
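The new javadoc documents getQsForW's input layout. A simplified re-implementation of that contract with linear interpolation, for illustration only; unlike the project code it may report a crossing twice when w hits a shared segment endpoint:

```java
/** Sketch of DischargeTables.getQsForW's contract:
 *  values is a 2-row table [[q0,q1,...],[w0,w1,...]]. */
public class QsForWSketch {

    /** All Qs at which the W column crosses w, linearly interpolated. */
    public static double[] getQsForW(double[][] values, double w) {
        double[] qs = values[0];
        double[] ws = values[1];
        double[] out = new double[ws.length];
        int n = 0;
        for (int i = 1; i < ws.length; ++i) {
            double w1 = ws[i - 1], w2 = ws[i];
            if (w >= Math.min(w1, w2) && w <= Math.max(w1, w2)) {
                if (w1 == w2) {             // flat segment: take its Q
                    out[n++] = qs[i - 1];
                    continue;
                }
                double t = (w - w1) / (w2 - w1);
                out[n++] = qs[i - 1] + t * (qs[i] - qs[i - 1]);
            }
        }
        return java.util.Arrays.copyOf(out, n);
    }
}
```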
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/FacetTypes.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/FacetTypes.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/FacetTypes.java Fri Mar 22 11:25:54 2013 +0100
@@ -161,7 +161,8 @@
String FLOODMAP_WMSBACKGROUND = "floodmap.wmsbackground";
String FLOODMAP_KMS = "floodmap.kms";
String FLOODMAP_QPS = "floodmap.qps";
- String FLOODMAP_HWS = "floodmap.hws";
+ String FLOODMAP_HWS_LINES = "floodmap.hws_lines";
+ String FLOODMAP_HWS_POINTS = "floodmap.hws_points";
String FLOODMAP_HYDR_BOUNDARY = "floodmap.hydr_boundaries";
String FLOODMAP_HYDR_BOUNDARY_POLY = "floodmap.hydr_boundaries_poly";
String FLOODMAP_CATCHMENT = "floodmap.catchment";
@@ -175,6 +176,8 @@
String DISCHARGE_LONGITUDINAL_W = "discharge_longitudinal_section.w";
String DISCHARGE_LONGITUDINAL_Q = "discharge_longitudinal_section.q";
+ String DISCHARGE_LONGITUDINAL_Q_INFOLD = "discharge_longitudinal_section.q.infolding";
+ String DISCHARGE_LONGITUDINAL_Q_INFOLD_CUT = "discharge_longitudinal_section.q.cutting";
String DISCHARGE_LONGITUDINAL_C = "discharge_longitudinal_section.c";
String LONGITUDINAL_W = "longitudinal_section.w";
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/FixingsOverview.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/FixingsOverview.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/FixingsOverview.java Fri Mar 22 11:25:54 2013 +0100
@@ -20,6 +20,8 @@
import org.w3c.dom.Document;
import org.w3c.dom.Element;
+
+/** Overview of fixings; provides the data shown in the fixings table. */
public class FixingsOverview
implements Serializable
{
@@ -644,11 +646,11 @@
public boolean accept(Fixing.Column column) {
for (SectorRange s: column.getSectors()) {
int v = s.getSector();
- if (v >= min && v <= max) {
- return true;
+ if (v < min || v > max) {
+ return false;
}
}
- return false;
+ return true;
}
} // class SectorRangeFilter
@@ -728,6 +730,7 @@
return gauges;
}
+ /** Populate document with fixings, filtered by range and filter. */
public void generateOverview(
Document document,
Range range,
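The SectorRangeFilter hunk inverts the acceptance logic: previously any sector inside [min, max] accepted the column, now a single sector outside rejects it (and, as a side effect, a column without sectors is now accepted instead of rejected). A sketch of the new semantics:

```java
import java.util.List;

/** Sketch of the new SectorRangeFilter semantics: all sectors must be in range. */
public class SectorFilterSketch {

    public static boolean acceptAll(List<Integer> sectors, int min, int max) {
        for (int v : sectors) {
            if (v < min || v > max) {
                return false;   // one out-of-range sector rejects the column
            }
        }
        return true;            // note: also true for an empty sector list
    }
}
```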
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/FlowVelocityCalculation.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/FlowVelocityCalculation.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/FlowVelocityCalculation.java Fri Mar 22 11:25:54 2013 +0100
@@ -1,9 +1,5 @@
package de.intevation.flys.artifacts.model;
-import java.util.ArrayList;
-import java.util.Collections;
-import java.util.List;
-
import de.intevation.artifacts.Artifact;
import de.intevation.flys.artifacts.access.FlowVelocityAccess;
@@ -11,7 +7,10 @@
import de.intevation.flys.model.DischargeZone;
import de.intevation.flys.model.FlowVelocityModel;
import de.intevation.flys.model.FlowVelocityModelValue;
-import de.intevation.flys.model.River;
+
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.List;
import org.apache.log4j.Logger;
@@ -97,21 +96,11 @@
return Collections.<FlowVelocityModel>emptyList();
}
- River river = RiverFactory.getRiver(riverName);
- if (river == null) {
- logger.warn("No such river: " + riverName);
- return Collections.<FlowVelocityModel>emptyList();
- }
-
List<FlowVelocityModel> models = new ArrayList<FlowVelocityModel>();
for (DischargeZone zone: zones) {
- List<FlowVelocityModel> model =
- FlowVelocityModel.getModels(river, zone);
-
- if (model != null) {
- models.addAll(model);
- }
+ List<FlowVelocityModel> model = FlowVelocityModel.getModels(zone);
+ models.addAll(model);
}
return models;
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/GaugeFinder.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/GaugeFinder.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/GaugeFinder.java Fri Mar 22 11:25:54 2013 +0100
@@ -11,6 +11,7 @@
import org.hibernate.type.StandardBasicTypes;
+/** Find Gauges and respective Q main values. */
public class GaugeFinder
implements Serializable
{
@@ -62,6 +63,8 @@
this.isKmUp = isKmUp;
}
+
+ /** Find GaugeRange at kilometer. */
public GaugeRange find(double km) {
for (GaugeRange gauge: gauges) {
if (gauge.inside(km)) {
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/GaugeFinderFactory.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/GaugeFinderFactory.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/GaugeFinderFactory.java Fri Mar 22 11:25:54 2013 +0100
@@ -19,6 +19,7 @@
import org.hibernate.type.StandardBasicTypes;
+/** Get GaugeFinders. */
public class GaugeFinderFactory
{
private static Logger log = Logger.getLogger(GaugeFinderFactory.class);
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/GaugeRange.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/GaugeRange.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/GaugeRange.java Fri Mar 22 11:25:54 2013 +0100
@@ -9,6 +9,9 @@
import org.apache.log4j.Logger;
+/**
+ * Gauge, km-range, main values.
+ */
public class GaugeRange
extends Range
{
@@ -29,16 +32,21 @@
protected int gaugeId;
+ /** Main values (e.g. MNQ, MQ, MHQ), keyed by label. */
protected Map<String, Double> mainValues;
+
protected List<Sector> sectors;
+
public GaugeRange() {
}
+
public GaugeRange(double start, double end, int gaugeId) {
this(start, end, null, gaugeId);
}
+
public GaugeRange(
double start,
double end,
@@ -52,6 +60,7 @@
sectors = new ArrayList<Sector>(3);
}
+
public void addMainValue(String label, Double value) {
int idx = label.indexOf('(');
if (idx >= 0) {
@@ -60,6 +69,7 @@
mainValues.put(label, value);
}
+
protected Double getMainValue(String label) {
Double v = mainValues.get(label);
if (v == null) {
@@ -69,6 +79,12 @@
return v;
}
+
+ public Map<String, Double> getMainValues() {
+ return mainValues;
+ }
+
+
public void buildClasses() {
Double mnq = getMainValue("MNQ");
Double mq = getMainValue("MQ");
@@ -89,6 +105,7 @@
}
}
+
public double getSectorBorder(int sector) {
for (Sector s: sectors) {
if (s.sector == sector) {
@@ -98,6 +115,7 @@
return Double.NaN;
}
+
public int classify(double value) {
for (Sector sector: sectors) {
if (value < sector.value) {
@@ -107,14 +125,17 @@
return sectors.size();
}
+
public String getName() {
return name;
}
+
public void setName(String name) {
this.name = name;
}
+
public String toString() {
StringBuilder sb = new StringBuilder("sectors: [");
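GaugeRange#classify walks the sector borders built from main values and returns the index of the first border the value falls below, or the sector count for values beyond the last border. A sketch with hard-coded borders standing in for MNQ/MQ/MHQ:

```java
/** Sketch of GaugeRange#classify against fixed sector borders. */
public class ClassifySketch {

    // Illustrative border values standing in for MNQ, MQ, MHQ.
    static final double[] BORDERS = { 10d, 20d, 30d };

    public static int classify(double value) {
        for (int i = 0; i < BORDERS.length; i++) {
            if (value < BORDERS[i]) {
                return i;
            }
        }
        return BORDERS.length;  // beyond the last border
    }
}
```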
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/HistoricalDischargeData.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/HistoricalDischargeData.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/HistoricalDischargeData.java Fri Mar 22 11:25:54 2013 +0100
@@ -24,3 +24,4 @@
return wqs;
}
}
+// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/HistoricalDischargeDifferenceFacet.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/HistoricalDischargeDifferenceFacet.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/HistoricalDischargeDifferenceFacet.java Fri Mar 22 11:25:54 2013 +0100
@@ -10,6 +10,7 @@
/**
+ * Difference of historical discharge curve to ...
* @author <a href="mailto:ingo.weinzierl at intevation.de">Ingo Weinzierl</a>
*/
public class HistoricalDischargeDifferenceFacet
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/HistoricalWQKms.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/HistoricalWQKms.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/HistoricalWQKms.java Fri Mar 22 11:25:54 2013 +0100
@@ -20,3 +20,4 @@
return timeInterval;
}
}
+// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/HistoricalWQTimerange.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/HistoricalWQTimerange.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/HistoricalWQTimerange.java Fri Mar 22 11:25:54 2013 +0100
@@ -9,7 +9,7 @@
/**
* A subclass of WQTimerange that stores besides W, Q and Timerange values
- * another double value.
+ * another double value (difference to something).
*
* @author <a href="mailto:ingo.weinzierl at intevation.de">Ingo Weinzierl</a>
*/
@@ -50,7 +50,7 @@
public void add(double w, double q, double diff, Timerange t) {
ws.add(w);
qs.add(q);
- ts.add(t);
+ timeranges.add(t);
diffs.add(diff);
}
@@ -74,7 +74,7 @@
@Override
public List<TimerangeItem> sort() {
- ArrayList<TimerangeItem> items = new ArrayList<TimerangeItem>(ts.size());
+ ArrayList<TimerangeItem> items = new ArrayList<TimerangeItem>(timeranges.size());
for (int i = 0, n = size(); i < n; i++) {
items.add(new HistoricalTimerangeItem(getTimerange(i), getQ(i), getW(i), diffs.get(i)));
}
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/MiddleBedHeightFacet.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/MiddleBedHeightFacet.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/MiddleBedHeightFacet.java Fri Mar 22 11:25:54 2013 +0100
@@ -11,6 +11,7 @@
import org.apache.log4j.Logger;
+
/**
* Facet of a MiddleBedHeight curve.
*/
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/QRangeTree.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/QRangeTree.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/QRangeTree.java Fri Mar 22 11:25:54 2013 +0100
@@ -151,8 +151,8 @@
}
public double [] findQs(
- double [] kms,
- double [] qs,
+ double [] kms,
+ double [] qs,
Calculation report
) {
for (int i = 0; i < kms.length; ++i) {
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/Segment.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/Segment.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/Segment.java Fri Mar 22 11:25:54 2013 +0100
@@ -15,6 +15,7 @@
import org.apache.log4j.Logger;
+/** A Range with values and a reference point. */
public class Segment
implements Serializable
{
@@ -53,6 +54,7 @@
return from < to;
}
+ /** Checks whether given km lies inside the to/from bounds of this segment. */
public boolean inside(double km) {
return from < to
? km >= from && km <= to
@@ -120,6 +122,7 @@
return referencePoint;
}
+ /** Parses Segments from the input string (delegating to DoubleUtil). */
public static List<Segment> parseSegments(String input) {
final List<Segment> segments = new ArrayList<Segment>();
@@ -176,18 +179,21 @@
DischargeTable dt = gauge.fetchMasterDischargeTable();
+ //TODO: Change scale from 100 to 1 immediately after
+ // discharge table import changed to cm!
double [][] table =
- DischargeTables.loadDischargeTableValues(dt, 1);
+ DischargeTables.loadDischargeTableValues(dt, 100);
// need the original values for naming
segment.backup();
for (int i = 0; i < values.length; ++i) {
- double w = values[i] / 100.0;
+ //TODO: s.o.
+ double w = values[i]; /* / 100.0; */
double [] qs = DischargeTables.getQsForW(table, w);
if (qs.length == 0) {
log.warn("No Qs found for W = " + values[i]);
- report.addProblem("cannot.find.w.for.q", values[i]);
+ report.addProblem("cannot.find.q.for.w", values[i]);
values[i] = Double.NaN;
success = false;
}
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/WQTimerange.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/WQTimerange.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/WQTimerange.java Fri Mar 22 11:25:54 2013 +0100
@@ -6,10 +6,12 @@
/**
+ * A collection of triples W,Q,Timerange.
* @author <a href="mailto:ingo.weinzierl at intevation.de">Ingo Weinzierl</a>
*/
public class WQTimerange extends WQ {
+ /** Used to sort <w,q,timerange> triples. */
public static class TimerangeItem implements Comparable<TimerangeItem> {
public double q;
public double w;
@@ -21,6 +23,7 @@
this.w = w;
}
+ /** Sets [w,q] in wq. */
public double[] get(double[] wq) {
if (wq.length >= 2) {
wq[0] = w;
@@ -50,7 +53,7 @@
}
}
- protected List<Timerange> ts;
+ protected List<Timerange> timeranges;
public WQTimerange() {
@@ -60,28 +63,28 @@
public WQTimerange(String name) {
super(name);
- ts = new ArrayList<Timerange>();
+ timeranges = new ArrayList<Timerange>();
}
public void add(double w, double q, Timerange t) {
ws.add(w);
qs.add(q);
- ts.add(t);
+ timeranges.add(t);
}
public Timerange getTimerange(int idx) {
- return ts.get(idx);
+ return timeranges.get(idx);
}
public Timerange[] getTimeranges() {
- return ts.toArray(new Timerange[ts.size()]);
+ return timeranges.toArray(new Timerange[timeranges.size()]);
}
public List<TimerangeItem> sort() {
- ArrayList<TimerangeItem> items = new ArrayList<TimerangeItem>(ts.size());
+ ArrayList<TimerangeItem> items = new ArrayList<TimerangeItem>(timeranges.size());
for (int i = 0, n = size(); i < n; i++) {
items.add(new TimerangeItem(getTimerange(i), getQ(i), getW(i)));
}
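After the ts to timeranges rename, sort() still zips the parallel ws/qs/timeranges lists into comparable items ordered by timerange. A sketch reduced to start times and index order, which keeps it self-contained:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

/** Sketch of WQTimerange#sort: order the parallel lists by timerange. */
public class TimerangeSortSketch {

    /** Indices of the given start times in ascending time order. */
    public static int[] sortedByTime(final long[] starts) {
        List<Integer> idx = new ArrayList<Integer>();
        for (int i = 0; i < starts.length; i++) {
            idx.add(i);
        }
        Collections.sort(idx, (a, b) -> Long.compare(starts[a], starts[b]));
        int[] out = new int[starts.length];
        for (int i = 0; i < out.length; i++) {
            out[i] = idx.get(i);
        }
        return out;
    }
}
```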
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/fixings/FixCalculation.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/fixings/FixCalculation.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/fixings/FixCalculation.java Fri Mar 22 11:25:54 2013 +0100
@@ -32,6 +32,7 @@
import org.apache.log4j.Logger;
+/** Calculation base class for fix. */
public abstract class FixCalculation
extends Calculation
{
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/fixings/FixRealizingCalculation.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/fixings/FixRealizingCalculation.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/fixings/FixRealizingCalculation.java Fri Mar 22 11:25:54 2013 +0100
@@ -17,6 +17,7 @@
import org.apache.log4j.Logger;
+/** Calculation for FixRealize (german: ausgel. WSPL). */
public class FixRealizingCalculation
extends FixCalculation
{
@@ -128,7 +129,7 @@
}
}
- // name the curves
+ // Name the curves.
for (int i = 0; i < results.length; ++i) {
results[i].setName(createName(i));
}
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/fixings/FixReferenceEventsFacet.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/fixings/FixReferenceEventsFacet.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/fixings/FixReferenceEventsFacet.java Fri Mar 22 11:25:54 2013 +0100
@@ -64,6 +64,8 @@
FixResult result = (FixResult) res.getData();
double currentKm = getCurrentKm(context);
+ logger.debug("current km in FRE: " + currentKm);
+
KMIndex<QWD []> kmQWs = result.getReferenced();
KMIndex.Entry<QWD []> kmQWsEntry = kmQWs.binarySearch(currentKm);
QWD[] qwds = null;
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/map/HWS.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/map/HWS.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,153 @@
+package de.intevation.flys.artifacts.model.map;
+
+import org.geotools.feature.simple.SimpleFeatureBuilder;
+import org.opengis.feature.simple.SimpleFeature;
+import org.opengis.feature.simple.SimpleFeatureType;
+
+import com.vividsolutions.jts.geom.Geometry;
+
+import de.intevation.flys.artifacts.model.NamedObjectImpl;
+import de.intevation.flys.utils.GeometryUtils;
+
+
+public class HWS
+extends NamedObjectImpl
+{
+
+ public enum TYPE {LINE, POINT};
+
+ private Geometry geom;
+ private String id;
+ private int kind;
+ private int official;
+ private String fedState;
+ private String description;
+ private TYPE type;
+
+ public HWS() {
+ this.geom = null;
+ // TODO Auto-generated constructor stub
+ }
+
+ public HWS(String name) {
+ super(name);
+ this.geom = null;
+ }
+
+ public HWS(
+ String name,
+ Geometry geom,
+ String id,
+ int kind,
+ int official,
+ String fedState,
+ String description,
+ TYPE type
+ ) {
+ super(name);
+ this.geom = geom;
+ this.id = id;
+ this.kind = kind;
+ this.official = official;
+ this.fedState = fedState;
+ this.description = description;
+ this.type = type;
+ }
+
+ public Geometry getGeom() {
+ return geom;
+ }
+
+ public void setGeom(Geometry geom) {
+ this.geom = geom;
+ }
+
+ public String getId() {
+ return id;
+ }
+
+ public void setId(String id) {
+ this.id = id;
+ }
+
+ public int getKind() {
+ return kind;
+ }
+
+ public void setKind(int kind) {
+ this.kind = kind;
+ }
+
+ public boolean isOfficial() {
+ return official == 1;
+ }
+
+ public void setOfficial(boolean o) {
+ this.official = o ? 1 : 0;
+ }
+
+ public String getFedState() {
+ return fedState;
+ }
+
+ public void setFedState(String fedState) {
+ this.fedState = fedState;
+ }
+
+ public String getDescription() {
+ return description;
+ }
+
+ public void setDescription(String description) {
+ this.description = description;
+ }
+
+ public TYPE getType() {
+ return type;
+ }
+
+ public void setType(TYPE type) {
+ this.type = type;
+ }
+
+ public SimpleFeatureType getFeatureType() {
+ int srid = this.geom.getSRID();
+ String srs = "EPSG:" + srid;
+
+ Object[][] attrs = new Object[5][];
+ attrs[0] = new Object[] { "name", String.class };
+ attrs[1] = new Object[] { "description", String.class };
+ attrs[2] = new Object[] { "TYP", String.class };
+ attrs[3] = new Object[] { "fed_state", String.class };
+ attrs[4] = new Object[] { "official", Integer.class };
+ SimpleFeatureType ft =
+ GeometryUtils.buildFeatureType(
+ "hws", srs, this.geom.getClass(), attrs);
+ return ft;
+ }
+
+ public SimpleFeature getFeature() {
+ SimpleFeatureType ft = getFeatureType();
+
+ SimpleFeatureBuilder featureBuilder = new SimpleFeatureBuilder(ft);
+ featureBuilder.add(this.geom);
+ featureBuilder.add(this.name);
+ featureBuilder.add(this.description);
+ if (this.kind == 1) {
+ featureBuilder.add("Rohr 1");
+ }
+ else if (this.kind == 2) {
+ featureBuilder.add("Damm");
+ }
+ else if (this.kind == 3) {
+ featureBuilder.add("Graben");
+ }
+ else {
+ featureBuilder.add("");
+ }
+ featureBuilder.add(this.fedState);
+ featureBuilder.add(this.official);
+
+ return featureBuilder.buildFeature(null);
+ }
+}
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/map/HWSContainer.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/map/HWSContainer.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,93 @@
+package de.intevation.flys.artifacts.model.map;
+
+import java.util.ArrayList;
+import java.util.List;
+
+import org.apache.log4j.Logger;
+
+public class HWSContainer
+{
+ private static Logger logger = Logger.getLogger(HWSContainer.class);
+ private String river;
+ private HWS.TYPE type;
+ private List<HWS> hws;
+
+ public HWSContainer() {
+ river = null;
+ hws = new ArrayList<HWS>();
+ }
+
+ public HWSContainer(String river, HWS.TYPE type, List<HWS> hws) {
+ this.river = river;
+ this.hws = hws;
+ this.type = type;
+ }
+
+ public void setRiver(String river) {
+ this.river = river;
+ }
+
+ public String getRiver() {
+ return this.river;
+ }
+
+ public HWS.TYPE getType() {
+ return type;
+ }
+
+ public void setType(HWS.TYPE type) {
+ this.type = type;
+ }
+
+ public List<HWS> getHws() {
+ return hws;
+ }
+
+ public void addHws(HWS hws) {
+ logger.debug("add hws: " + hws.getName());
+ this.hws.add(hws);
+ }
+
+ public void addHws(List<HWS> hws) {
+ this.hws.addAll(hws);
+ }
+
+ public List<HWS> getOfficialHWS() {
+ if (hws == null || hws.size() == 0) {
+ return new ArrayList<HWS>();
+ }
+ List<HWS> results = new ArrayList<HWS>();
+ for (HWS h: hws) {
+ if (h.isOfficial()) {
+ results.add(h);
+ }
+ }
+ return results;
+ }
+
+ public List<HWS> getHws(String name) {
+ logger.debug("find: " + name + " in " + hws.size() + " elements");
+ if (hws == null || hws.size() == 0) {
+ return new ArrayList<HWS>();
+ }
+ List<HWS> results = new ArrayList<HWS>();
+ for (HWS h: hws) {
+ if (h.getName().equals(name)) {
+ results.add(h);
+ }
+ }
+ logger.debug("found: " + results.size());
+ return results;
+ }
+
+ public List<HWS> getHws(List<String> list) {
+ if (hws == null || hws.size() == 0) {
+ return new ArrayList<HWS>();
+ }
+ List<HWS> results = new ArrayList<HWS>();
+ for (String name : list) {
+ results.addAll(getHws(name));
+ }
+ return results;
+ }
+}
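HWSContainer#getHws(List) concatenates the per-name lookups, so request order (and duplicates) carry through to the result. A sketch with plain strings standing in for HWS objects:

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch of HWSContainer's name-based selection. */
public class HwsLookupSketch {

    /** All elements of 'all' matching each wanted name, in request order. */
    public static List<String> select(List<String> all, List<String> wanted) {
        List<String> results = new ArrayList<String>();
        for (String name : wanted) {
            for (String h : all) {
                if (h.equals(name)) {
                    results.add(h);
                }
            }
        }
        return results;
    }
}
```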
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/map/HWSFactory.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/map/HWSFactory.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,179 @@
+package de.intevation.flys.artifacts.model.map;
+
+import java.util.List;
+
+import net.sf.ehcache.Cache;
+import net.sf.ehcache.Element;
+
+import org.apache.log4j.Logger;
+import org.hibernate.SQLQuery;
+import org.hibernate.Session;
+import org.hibernate.type.StandardBasicTypes;
+import org.hibernatespatial.GeometryUserType;
+
+import com.vividsolutions.jts.geom.Geometry;
+
+import de.intevation.flys.artifacts.cache.CacheFactory;
+import de.intevation.flys.backend.SessionHolder;
+
+
+public class HWSFactory
+{
+ /** Private logger to use here. */
+ private static Logger log = Logger.getLogger(HWSFactory.class);
+
+ private static final int HWS_LINES = 0;
+ private static final int HWS_POINTS = 1;
+
+ public static final String SQL_SELECT_LINES =
+ "SELECT hl.name, hl.geom, hl.id, hl.kind_id, hl.official, fs.name AS fed, hl.description " +
+ " FROM hws_lines hl" +
+ " JOIN rivers r ON hl.river_id = r.id" +
+ " LEFT JOIN fed_states fs ON hl.fed_state_id = fs.id" +
+ " WHERE r.name = :river";
+
+ public static final String SQL_SELECT_POINTS =
+ "SELECT hp.name, hp.geom, hp.id, hp.kind_id, hp.official, fs.name AS fed, hp.description " +
+ " FROM hws_points hp" +
+ " JOIN rivers r ON hp.river_id = r.id" +
+ " LEFT JOIN fed_states fs ON hp.fed_state_id = fs.id" +
+ " WHERE r.name = :river";
+
+
+ private HWSFactory() {
+ }
+
+
+ public static HWSContainer getHWSLines(String river) {
+ log.debug("HWSFactory.getHWSLines");
+ Cache cache = CacheFactory.getCache(StaticHWSCacheKey.CACHE_NAME);
+
+ StaticHWSCacheKey cacheKey;
+
+ if (cache != null) {
+ cacheKey = new StaticHWSCacheKey(river, HWS_LINES);
+ Element element = cache.get(cacheKey);
+ if (element != null) {
+ log.debug("Got static hws values from cache");
+ return (HWSContainer)element.getValue();
+ }
+ }
+ else {
+ cacheKey = null;
+ }
+
+ HWSContainer values = getHWSLinesUncached(river);
+
+ if (values != null && cacheKey != null) {
+ log.debug("Store static hws values in cache.");
+ Element element = new Element(cacheKey, values);
+ cache.put(element);
+ }
+ return values;
+ }
+
+ public static HWSContainer getHWSPoints(String river) {
+ log.debug("HWSFactory.getHWSPoints");
+ Cache cache = CacheFactory.getCache(StaticHWSCacheKey.CACHE_NAME);
+
+ StaticHWSCacheKey cacheKey;
+
+ if (cache != null) {
+ cacheKey = new StaticHWSCacheKey(river, HWS_POINTS);
+ Element element = cache.get(cacheKey);
+ if (element != null) {
+ log.debug("Got static hws values from cache");
+ return (HWSContainer)element.getValue();
+ }
+ }
+ else {
+ cacheKey = null;
+ }
+
+ HWSContainer values = getHWSPointsUncached(river);
+
+ if (values != null && cacheKey != null) {
+ log.debug("Store static hws values in cache.");
+ Element element = new Element(cacheKey, values);
+ cache.put(element);
+ }
+ return values;
+ }
+
+ private static HWSContainer getHWSLinesUncached(String river) {
+ if (log.isDebugEnabled()) {
+ log.debug("HWSFactory.getHWSLinesUncached");
+ }
+
+ Session session = SessionHolder.HOLDER.get();
+ SQLQuery sqlQuery = null;
+ HWSContainer container = new HWSContainer();
+ container.setRiver(river);
+ container.setType(HWS.TYPE.LINE);
+ sqlQuery = session.createSQLQuery(SQL_SELECT_LINES)
+ .addScalar("name", StandardBasicTypes.STRING)
+ .addScalar("geom", GeometryUserType.TYPE)
+ .addScalar("id", StandardBasicTypes.STRING)
+ .addScalar("kind_id", StandardBasicTypes.INTEGER)
+ .addScalar("official", StandardBasicTypes.INTEGER)
+ .addScalar("fed", StandardBasicTypes.STRING)
+ .addScalar("description", StandardBasicTypes.STRING);
+
+ sqlQuery.setString("river", river);
+ List<Object []> resultsLines = sqlQuery.list();
+
+ for (Object [] row: resultsLines) {
+ container.addHws(
+ new HWS(
+ (String) row[0],
+ (Geometry) row[1],
+ (String) row[2],
+ (Integer) row[3],
+ (Integer) row[4],
+ (String) row[5],
+ (String) row[6],
+ HWS.TYPE.LINE));
+ }
+
+ return container;
+ }
+
+ private static HWSContainer getHWSPointsUncached(String river) {
+ if (log.isDebugEnabled()) {
+ log.debug("HWSFactory.getHWSPointsUncached");
+ }
+
+ Session session = SessionHolder.HOLDER.get();
+ SQLQuery sqlQuery = null;
+ HWSContainer container = new HWSContainer();
+ container.setRiver(river);
+ container.setType(HWS.TYPE.POINT);
+ sqlQuery = session.createSQLQuery(SQL_SELECT_POINTS)
+ .addScalar("name", StandardBasicTypes.STRING)
+ .addScalar("geom", GeometryUserType.TYPE)
+ .addScalar("id", StandardBasicTypes.STRING)
+ .addScalar("kind_id", StandardBasicTypes.INTEGER)
+ .addScalar("official", StandardBasicTypes.INTEGER)
+ .addScalar("fed", StandardBasicTypes.STRING)
+ .addScalar("description", StandardBasicTypes.STRING);
+
+ sqlQuery.setString("river", river);
+ List<Object []> resultsPoints = sqlQuery.list();
+
+ for (Object [] row: resultsPoints) {
+ container.addHws(
+ new HWS(
+ (String) row[0],
+ (Geometry) row[1],
+ (String) row[2],
+ (Integer) row[3],
+ (Integer) row[4],
+ (String) row[5],
+ (String) row[6],
+ HWS.TYPE.POINT));
+ }
+
+ return container;
+ }
+}
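Both getHWSLines and getHWSPoints follow the same cache-aside pattern: try the cache, compute on a miss, then store the result. The sketch below shows that flow with a plain HashMap standing in for the Ehcache Cache (an assumption for illustration; the real code also handles an absent cache):

```java
import java.util.HashMap;
import java.util.Map;

// Cache-aside lookup as used by HWSFactory.getHWSLines/getHWSPoints,
// with a HashMap standing in for the Ehcache instance.
public class CacheAsideSketch {
    private final Map<String, String> cache = new HashMap<String, String>();

    int misses = 0; // counts expensive lookups, for demonstration

    // Stand-in for the expensive, uncached database query.
    String loadUncached(String river) {
        misses++;
        return "container-for-" + river;
    }

    String get(String river) {
        String cached = cache.get(river);   // 1. try the cache
        if (cached != null) {
            return cached;
        }
        String value = loadUncached(river); // 2. compute on miss
        cache.put(river, value);            // 3. store for next time
        return value;
    }
}
```

A second lookup for the same river is then served entirely from the cache.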
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/map/StaticHWSCacheKey.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/map/StaticHWSCacheKey.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,27 @@
+package de.intevation.flys.artifacts.model.map;
+
+
+public class StaticHWSCacheKey
+{
+ public static final String CACHE_NAME = "hws-value-table-static";
+
+ private String river;
+ private int type;
+
+ public StaticHWSCacheKey(String river, int type) {
+ this.river = river;
+ this.type = type;
+ }
+
+ @Override
+ public int hashCode() {
+ return river.hashCode() | (type << 8);
+ }
+
+ @Override
+ public boolean equals(Object other) {
+ if (!(other instanceof StaticHWSCacheKey)) {
+ return false;
+ }
+ StaticHWSCacheKey o = (StaticHWSCacheKey) other;
+ // Compare by value and include the type, so line and point
+ // entries for the same river do not collide.
+ return this.river.equals(o.river) && this.type == o.type;
+ }
+}
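A class used as a cache key must be a proper value object: equals() and hashCode() have to agree and cover every identifying field. A self-contained sketch of such a key over (river, type), with a hypothetical class name:

```java
// Value-object cache key over (river, type). Both fields take part in
// equals() and hashCode(), so line and point entries for the same
// river map to distinct cache slots.
public class RiverTypeKey {
    private final String river;
    private final int type;

    public RiverTypeKey(String river, int type) {
        this.river = river;
        this.type = type;
    }

    @Override
    public int hashCode() {
        return river.hashCode() ^ (type << 8);
    }

    @Override
    public boolean equals(Object other) {
        if (!(other instanceof RiverTypeKey)) {
            return false;
        }
        RiverTypeKey o = (RiverTypeKey) other;
        return river.equals(o.river) && type == o.type;
    }
}
```

Comparing the river with equals() rather than == matters because equal string contents do not guarantee identical references.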
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/minfo/QualityMeasurement.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/minfo/QualityMeasurement.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/minfo/QualityMeasurement.java Fri Mar 22 11:25:54 2013 +0100
@@ -76,5 +76,4 @@
public void setDepth2(double depth2) {
this.depth2 = depth2;
}
-
}
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/sq/MeasurementFactory.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/sq/MeasurementFactory.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/model/sq/MeasurementFactory.java Fri Mar 22 11:25:54 2013 +0100
@@ -434,7 +434,7 @@
if (debug) {
log.debug("fractions - s: " +
sandF + " c: " +
- coarseF + " g: " +
+ coarseF + " g: " +
gravelF);
log.debug("scale: " + scale + " = " + effWidth + " * " + gt);
}
@@ -453,7 +453,7 @@
if (debug) {
log.debug(
"BL_S: " + m.get("BL_S") +
- " BL_G: " + m.get("BL_G") +
+ " BL_G: " + m.get("BL_G") +
" BL_C: " + m.get("BL_C"));
}
return m;
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/services/AbstractChartService.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/services/AbstractChartService.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/services/AbstractChartService.java Fri Mar 22 11:25:54 2013 +0100
@@ -19,7 +19,7 @@
import de.intevation.artifacts.GlobalContext;
import de.intevation.artifacts.Service;
-
+/** Serve chart. */
public abstract class AbstractChartService extends DefaultService {
public static final int DEFAULT_WIDTH = 240;
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/services/DischargeTablesOverview.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/services/DischargeTablesOverview.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/services/DischargeTablesOverview.java Fri Mar 22 11:25:54 2013 +0100
@@ -13,6 +13,7 @@
import org.hibernate.Session;
import org.jfree.chart.ChartFactory;
import org.jfree.chart.JFreeChart;
+import org.jfree.chart.plot.Marker;
import org.jfree.chart.plot.PlotOrientation;
import org.jfree.chart.plot.XYPlot;
import org.jfree.data.xy.XYSeries;
@@ -24,14 +25,17 @@
import de.intevation.artifacts.CallMeta;
import de.intevation.artifacts.GlobalContext;
import de.intevation.flys.artifacts.model.DischargeTables;
+import de.intevation.flys.artifacts.model.GaugeRange;
import de.intevation.flys.artifacts.model.GaugesFactory;
import de.intevation.flys.artifacts.resources.Resources;
import de.intevation.flys.backend.SessionHolder;
import de.intevation.flys.model.DischargeTable;
import de.intevation.flys.model.Gauge;
+import de.intevation.flys.model.MainValue;
import de.intevation.flys.model.TimeInterval;
+/** Generate Discharge Table chart. */
public class DischargeTablesOverview extends AbstractChartService {
private static final Logger log = Logger
@@ -123,6 +127,11 @@
plot.setDomainGridlinesVisible(true);
plot.setRangeGridlinesVisible(true);
+ applyMainValueMarkers(
+ plot,
+ gauge,
+ callMeta);
+
return chart;
}
@@ -148,6 +157,36 @@
return series;
}
+
+ /** Add domain markers to plot that indicate main values. */
+ protected static void applyMainValueMarkers(
+ XYPlot plot,
+ Gauge gauge,
+ CallMeta meta
+ ) {
+ String river = gauge.getRiver().getName();
+ double km = gauge.getStation().doubleValue();
+
+ // Get the gauge's main values.
+ List<MainValue> mainValues = gauge.getMainValues();
+ for (MainValue mainValue : mainValues) {
+ if (mainValue.getMainValue().getType().getName().equals("Q")) {
+ // It's a Q main value.
+ Marker m = FixingsKMChartService.createQSectorMarker(
+ mainValue.getValue().doubleValue(),
+ mainValue.getMainValue().getName());
+ plot.addDomainMarker(m);
+ }
+ else if (mainValue.getMainValue().getType().getName().equals("W")) {
+ // It's a W main value.
+ Marker m = FixingsKMChartService.createQSectorMarker(
+ mainValue.getValue().doubleValue(),
+ mainValue.getMainValue().getName());
+ plot.addRangeMarker(m);
+ }
+ }
+ }
+
protected String createSeriesTitle(CallMeta callMeta, DischargeTable dt)
throws IllegalArgumentException {
TimeInterval timeInterval = dt.getTimeInterval();
@@ -259,3 +298,4 @@
return dts;
}
}
+// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/DGMSelect.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/DGMSelect.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/DGMSelect.java Fri Mar 22 11:25:54 2013 +0100
@@ -89,8 +89,8 @@
throw new IllegalArgumentException(ERR_INVALID_DGM);
}
- double l = dgm.getLower().doubleValue();
- double u = dgm.getUpper().doubleValue();
+ double l = dgm.getRange().getA().doubleValue();
+ double u = dgm.getRange().getB().doubleValue();
double[] range = FLYSUtils.getKmFromTo(flys);
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/DischargeLongitudinalSection.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/DischargeLongitudinalSection.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/DischargeLongitudinalSection.java Fri Mar 22 11:25:54 2013 +0100
@@ -1,7 +1,9 @@
package de.intevation.flys.artifacts.states;
import de.intevation.artifactdatabase.state.Facet;
+import de.intevation.artifactdatabase.state.FacetActivity;
+import de.intevation.artifacts.Artifact;
import de.intevation.artifacts.CallContext;
import de.intevation.flys.artifacts.ChartArtifact;
@@ -32,6 +34,26 @@
private static Logger log =
Logger.getLogger(DischargeLongitudinalSection.class);
+ static {
+ // Activate/deactivate facets.
+ FacetActivity.Registry.getInstance().register(
+ "winfo",
+ new FacetActivity() {
+ @Override
+ public Boolean isInitialActive(
+ Artifact artifact,
+ Facet facet,
+ String output
+ ) {
+ String name = facet.getName();
+ if (name.equals(DISCHARGE_LONGITUDINAL_Q_INFOLD_CUT)) {
+ return Boolean.FALSE;
+ }
+ return Boolean.TRUE;
+ }
+ });
+ }
+
@Override
public Object computeAdvance(
FLYSArtifact artifact,
@@ -71,7 +93,7 @@
nameQ = "Q(" + nameW + ")";
}
- // Do not generate Waterlevel/Waterline facets
+ // Do not generate Waterlevel/Waterline facets
// for Q only curves.
if (!(wqkms[i] instanceof ConstantWQKms)) {
@@ -80,14 +102,24 @@
Facet s = new CrossSectionWaterLineFacet(i, nameW);
+ Facet q = new WaterlevelFacet(
+ i, DISCHARGE_LONGITUDINAL_Q, nameQ);
facets.add(s);
facets.add(w);
+ facets.add(q);
}
-
- Facet q = new WaterlevelFacet(
- i, DISCHARGE_LONGITUDINAL_Q, nameQ);
-
- facets.add(q);
+ else {
+ Facet q;
+ if (nameQ.contains("geschnitten")) {
+ q = new WaterlevelFacet(
+ i, DISCHARGE_LONGITUDINAL_Q_INFOLD_CUT, nameQ);
+ }
+ else {
+ q = new WaterlevelFacet(
+ i, DISCHARGE_LONGITUDINAL_Q_INFOLD, nameQ);
+ }
+ facets.add(q);
+ }
if (wqkms[i] instanceof WQCKms) {
// TODO DO i18n
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/FloodMapState.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/FloodMapState.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/FloodMapState.java Fri Mar 22 11:25:54 2013 +0100
@@ -17,7 +17,11 @@
import de.intevation.flys.artifacts.model.CalculationMessage;
import de.intevation.flys.artifacts.model.CalculationResult;
import de.intevation.flys.artifacts.model.FacetTypes;
+import de.intevation.flys.artifacts.model.LayerInfo;
import de.intevation.flys.artifacts.model.WQKms;
+import de.intevation.flys.artifacts.model.map.HWS;
+import de.intevation.flys.artifacts.model.map.HWSContainer;
+import de.intevation.flys.artifacts.model.map.HWSFactory;
import de.intevation.flys.artifacts.model.map.WMSLayerFacet;
import de.intevation.flys.artifacts.model.map.WSPLGENCalculation;
import de.intevation.flys.artifacts.model.map.WSPLGENJob;
@@ -42,16 +46,18 @@
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.ArrayList;
+import java.util.Arrays;
import java.util.List;
import org.apache.log4j.Logger;
+import org.apache.velocity.Template;
import org.geotools.feature.FeatureCollection;
import org.geotools.feature.FeatureCollections;
import org.geotools.feature.simple.SimpleFeatureBuilder;
+import org.hibernate.HibernateException;
import org.opengis.feature.simple.SimpleFeature;
import org.opengis.feature.simple.SimpleFeatureType;
-
public class FloodMapState
extends DefaultState
implements FacetTypes
@@ -86,6 +92,13 @@
public static final int WSPLGEN_DEFAULT_OUTPUT = 0;
+ private static final String HWS_LINES_SHAPE = "hws-lines.shp";
+
+ private static final String I18N_HWS_POINTS = "floodmap.hws.points";
+ private static final String I18N_HWS_LINES = "floodmap.hws.lines";
+ private static final String HWS_LINES = "hws-lines";
+ private static final String HWS_POINT_SHAPE = "hws-points.shp";
+ private static final String HWS_POINTS = "hws-points";
/**
* @param orig
@@ -289,6 +302,7 @@
WSPLGENCalculation calculation
) {
logger.debug("FloodMapState.prepareWSPLGENJob");
+ String scenario = artifact.getDataAsString("scenario");
WSPLGENJob job = new WSPLGENJob(
artifact,
@@ -304,15 +318,20 @@
setDelta(artifact, job);
setGel(artifact, job);
setDist(artifact, job);
- setLine(artifact, facetCreator, artifactDir, job);
- setUserShape(artifact, facetCreator, artifactDir, job);
setAxis(artifact, artifactDir, job);
setPro(artifact, artifactDir, job);
- setDgm(artifact, job);
+ setDgm(artifact, job, context);
setArea(artifact, artifactDir, job);
setOutFile(artifact, job);
setWsp(artifact, context, artifactDir, job); // WSP
-
+ if ("scenario.current".equals(scenario)) {
+ setOfficialHWS(artifact, facetCreator, artifactDir, job);
+ }
+ else if ("scenario.scenario".equals(scenario)) {
+ setAdditionalHWS(artifact, facetCreator, artifactDir, job);
+ setLine(artifact, facetCreator, artifactDir, job);
+ setUserShape(artifact, facetCreator, artifactDir, job);
+ }
// TODO
// setWspTag(artifact, job);
@@ -332,6 +351,106 @@
}
+ private void setAdditionalHWS(
+ FLYSArtifact artifact,
+ FacetCreator facetCreator,
+ File dir,
+ WSPLGENJob job) {
+ File line = new File(dir, HWS_LINES_SHAPE);
+ boolean lines = line.exists();
+ logger.debug("shp file exists: " + lines);
+ if (lines) {
+ job.addLin(dir + "/" + HWS_LINES_SHAPE);
+ facetCreator.createShapeFacet(I18N_HWS_LINES,
+ MapfileGenerator.MS_LAYER_PREFIX + HWS_LINES,
+ FLOODMAP_LINES, 2);
+ }
+ File point = new File(dir, HWS_POINT_SHAPE);
+ boolean points = point.exists();
+ logger.debug("shp file exists: " + points);
+ if (points) {
+ facetCreator.createShapeFacet(I18N_HWS_POINTS,
+ MapfileGenerator.MS_LAYER_PREFIX + HWS_POINTS,
+ FLOODMAP_FIXPOINTS, 3);
+ }
+ }
+
+
+ private void setOfficialHWS(
+ FLYSArtifact artifact,
+ FacetCreator facetCreator,
+ File artifactDir,
+ WSPLGENJob job) {
+ String river = artifact.getDataAsString("river");
+
+ HWSContainer hwsLines = HWSFactory.getHWSLines(river);
+ List<HWS> selectedLines = hwsLines.getOfficialHWS();
+
+ FeatureCollection collectionLines = FeatureCollections.newCollection();
+ SimpleFeatureType lineType = null;
+ for (HWS h : selectedLines) {
+ lineType = h.getFeatureType();
+ collectionLines.add(h.getFeature());
+ }
+ boolean successLines = false;
+ if (lineType != null && collectionLines.size() > 0) {
+ File shapeLines = new File(artifactDir, HWS_LINES_SHAPE);
+ successLines = GeometryUtils.writeShapefile(
+ shapeLines, lineType, collectionLines);
+ }
+ if (successLines) {
+ createMapfile(
+ artifact,
+ artifactDir,
+ MapfileGenerator.MS_LAYER_PREFIX + "hws-lines",
+ HWS_LINES_SHAPE,
+ "LINE",
+ "31467",
+ "hws");
+ job.addLin(artifactDir + "/" + HWS_LINES_SHAPE);
+ facetCreator.createShapeFacet(I18N_HWS_LINES,
+ MapfileGenerator.MS_LAYER_PREFIX + HWS_LINES,
+ FLOODMAP_HWS_LINES,2);
+ }
+ }
+
+
+ private void createMapfile(
+ FLYSArtifact artifact,
+ File artifactDir,
+ String name,
+ String hwsShapefile,
+ String type,
+ String srid,
+ String group
+ ) {
+ LayerInfo info = new LayerInfo();
+ info.setName(name + artifact.identifier());
+ info.setType(type);
+ info.setDirectory(artifact.identifier());
+ info.setTitle(name);
+ info.setData(hwsShapefile);
+ info.setSrid(srid);
+ info.setGroupTitle(group);
+ MapfileGenerator generator = new ArtifactMapfileGenerator();
+ Template tpl = generator.getTemplateByName(MapfileGenerator.SHP_LAYER_TEMPLATE);
+ try {
+ File layer = new File(artifactDir.getCanonicalPath() + "/" + name);
+ generator.writeLayer(info, layer, tpl);
+ List<String> layers = new ArrayList<String>();
+ layers.add(layer.getAbsolutePath());
+ generator.generate();
+ }
+ catch(FileNotFoundException fnfe) {
+ logger.warn("Could not find mapfile for hws layer");
+ }
+ catch (Exception ioe) {
+ logger.warn("Could not create mapfile for hws layer");
+ logger.warn(Arrays.toString(ioe.getStackTrace()));
+ }
+ }
+
+
protected void setOut(FLYSArtifact artifact, WSPLGENJob job) {
job.setOut(WSPLGEN_DEFAULT_OUTPUT);
}
@@ -448,6 +567,14 @@
logger.debug(
"Successfully created barrier line shapefile. " +
"Write shapefile path into WSPLGEN job.");
+ createMapfile(
+ artifact,
+ dir,
+ MapfileGenerator.MS_LAYER_PREFIX + "barriers-lines",
+ WSPLGEN_BARRIERS_LINES,
+ "LINE",
+ srid,
+ "barriers");
if (scenario.equals(WSPLGENJob.GEL_NOSPERRE)) {
logger.debug("WSPLGEN will not use barrier features.");
@@ -462,10 +589,19 @@
GeometryUtils.buildFeatureType("polygons", srs, Polygon.class, obj),
fcs[1]);
+
if (p) {
logger.debug(
"Successfully created barrier polygon shapefile. " +
"Write shapefile path into WSPLGEN job.");
+ createMapfile(
+ artifact,
+ dir,
+ MapfileGenerator.MS_LAYER_PREFIX + "barriers-poly",
+ shapePolys.getAbsolutePath(),
+ "POLYGON",
+ srid,
+ "barriers");
if (scenario.equals(WSPLGENJob.GEL_NOSPERRE)) {
logger.debug("WSPLGEN will not use barrier features.");
@@ -487,14 +623,17 @@
File dir,
WSPLGENJob job
) {
- File archive = new File(dir, WSPLGEN_USER_ZIP);
+ File archive = new File(dir, WSPLGEN_USER_SHAPE);
boolean exists = archive.exists();
- logger.debug("Zip file exists: " + exists);
+ logger.debug("shp file exists: " + exists);
if (exists) {
FileUtils.extractZipfile(archive, dir);
job.addLin(dir + "/" + WSPLGEN_USER_SHAPE);
- facetCreator.createUserShapeFacet();
+ facetCreator.createShapeFacet(FacetCreator.I18N_USERSHAPE,
+ MapfileGenerator.MS_LAYER_PREFIX + "user-rgd",
+ FLOODMAP_USERSHAPE,
+ 4);
}
}
@@ -606,8 +745,15 @@
String river = artifact.getDataAsString("river");
String srid = FLYSUtils.getRiverDGMSrid(river);
String srs = "EPSG:" + srid;
-logger.debug("srs: " + srs);
- List<RiverAxis> axes = RiverAxis.getRiverAxis(river);
+
+ List<RiverAxis> axes = null;
+ try {
+ axes = RiverAxis.getRiverAxis(river);
+ }
+ catch (HibernateException iae) {
+ logger.warn("No valid river axis found for " + river);
+ return;
+ }
if (axes == null || axes.isEmpty()) {
logger.warn("Could not find river axis for: '" + river + "'");
return;
@@ -684,7 +830,11 @@
}
- protected void setDgm(FLYSArtifact artifact, WSPLGENJob job) {
+ protected void setDgm(
+ FLYSArtifact artifact,
+ WSPLGENJob job,
+ CallContext context
+ ) {
String dgm_id = artifact.getDataAsString("dgm");
int id = -1;
@@ -701,7 +851,15 @@
return;
}
- job.setDgm(dgm.getPath());
+ File dgmPath = new File (dgm.getPath());
+ if (dgmPath.isAbsolute()) {
+ job.setDgm(dgm.getPath());
+ }
+ else {
+ FLYSContext fc = (FLYSContext)context.globalContext();
+ String prefix = (String) fc.get("dgm-path");
+ job.setDgm(prefix.trim() + dgm.getPath().trim());
+ }
}
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/HWSBarriersState.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/HWSBarriersState.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,359 @@
+package de.intevation.flys.artifacts.states;
+
+import java.io.File;
+import java.io.FileNotFoundException;
+import java.io.IOException;
+
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.List;
+
+import org.apache.log4j.Logger;
+
+import org.apache.velocity.Template;
+
+import org.geotools.data.shapefile.ShapefileDataStore;
+
+import org.geotools.feature.FeatureCollection;
+import org.geotools.feature.FeatureCollections;
+
+import org.opengis.feature.simple.SimpleFeatureType;
+
+import org.opengis.feature.type.GeometryDescriptor;
+
+import org.w3c.dom.Element;
+
+import de.intevation.artifactdatabase.state.Facet;
+
+import de.intevation.artifacts.Artifact;
+import de.intevation.artifacts.CallContext;
+
+import de.intevation.artifacts.common.utils.FileTools;
+
+import de.intevation.artifacts.common.utils.XMLUtils.ElementCreator;
+
+import de.intevation.flys.artifacts.FLYSArtifact;
+
+import de.intevation.flys.artifacts.access.MapAccess;
+
+import de.intevation.flys.artifacts.model.LayerInfo;
+
+import de.intevation.flys.artifacts.model.map.HWS;
+import de.intevation.flys.artifacts.model.map.HWSContainer;
+import de.intevation.flys.artifacts.model.map.HWSFactory;
+
+import de.intevation.flys.utils.ArtifactMapfileGenerator;
+import de.intevation.flys.utils.FLYSUtils;
+import de.intevation.flys.utils.GeometryUtils;
+import de.intevation.flys.utils.MapfileGenerator;
+
+public class HWSBarriersState
+extends DefaultState
+{
+
+ /** The logger that is used in this class.*/
+ private static Logger logger = Logger.getLogger(HWSBarriersState.class);
+ private static final String HWS_SHAPEFILE_LINES = "hws-lines.shp";
+ private static final String HWS_SHAPEFILE_POINTS = "hws-points.shp";
+ private static final String USER_RGD_SHAPE = "user-rgd.shp";
+ private static final String USER_RGD_ZIP = "user-rgd.zip";
+ private static final String USER_RGD_FILENAME = "user-rgd";
+ @Override
+ protected String getUIProvider() {
+ return "map_digitize";
+ }
+
+
+ @Override
+ protected Element createStaticData(
+ FLYSArtifact flys,
+ ElementCreator creator,
+ CallContext cc,
+ String name,
+ String value,
+ String type
+ ) {
+ Element dataElement = creator.create("data");
+ creator.addAttr(dataElement, "name", name, true);
+ creator.addAttr(dataElement, "type", type, true);
+
+ Element itemElement = creator.create("item");
+ creator.addAttr(itemElement, "value", value, true);
+
+ creator.addAttr(itemElement, "label", "", true);
+ dataElement.appendChild(itemElement);
+
+ return dataElement;
+ }
+
+
+ @Override
+ public Object computeAdvance(
+ FLYSArtifact artifact,
+ String hash,
+ CallContext context,
+ List<Facet> facets,
+ Object old
+ ) {
+ File artifactDir = getDirectory(artifact);
+
+ if (artifactDir == null) {
+ logger.error("Could not create directory for HWS shapefile!");
+ return null;
+ }
+
+ MapAccess access = new MapAccess(artifact, context);
+ String river = access.getRiver();
+ HWSContainer hwsLines = HWSFactory.getHWSLines(river);
+ HWSContainer hwsPoints = HWSFactory.getHWSPoints(river);
+ List<String> selected = access.getHWS();
+
+ List<HWS> selectedLines = hwsLines.getHws(selected);
+ List<HWS> selectedPoints = hwsPoints.getHws(selected);
+
+ FeatureCollection collectionLines = FeatureCollections.newCollection();
+ SimpleFeatureType lineType = null;
+ for (HWS h : selectedLines) {
+ lineType = h.getFeatureType();
+ collectionLines.add(h.getFeature());
+ }
+ boolean successLines = false;
+ if (lineType != null && collectionLines.size() > 0) {
+ File shapeLines = new File(artifactDir, HWS_SHAPEFILE_LINES);
+ successLines = GeometryUtils.writeShapefile(
+ shapeLines, lineType, collectionLines);
+ }
+
+ FeatureCollection collectionPoints = FeatureCollections.newCollection();
+ SimpleFeatureType pointType = null;
+ for (HWS h : selectedPoints) {
+ pointType = h.getFeatureType();
+ collectionPoints.add(h.getFeature());
+ }
+ boolean successPoints = false;
+ if (pointType != null && collectionPoints.size() > 0) {
+ File shapePoints = new File(artifactDir, HWS_SHAPEFILE_POINTS);
+ successPoints = GeometryUtils.writeShapefile(
+ shapePoints, pointType, collectionPoints);
+ }
+
+ if (successLines) {
+ createMapfile(
+ artifact,
+ artifactDir,
+ MapfileGenerator.MS_LAYER_PREFIX + "hws-lines",
+ HWS_SHAPEFILE_LINES,
+ "LINE",
+ "31467",
+ "hws");
+ }
+ if (successPoints) {
+ createMapfile(
+ artifact,
+ artifactDir,
+ MapfileGenerator.MS_LAYER_PREFIX + "hws-points",
+ HWS_SHAPEFILE_POINTS,
+ "POINT",
+ "31467",
+ "hws");
+ }
+
+ String userRgd = artifact.getDataAsString("uesk.user-rgd");
+ if (userRgd != null && !userRgd.equals("none")) {
+ if (extractUserShp(artifactDir)) {
+ try {
+ ShapefileDataStore store = new ShapefileDataStore(
+ new File(artifactDir.getCanonicalPath() +
+ "/" + USER_RGD_SHAPE)
+ .toURI().toURL());
+ GeometryDescriptor desc =
+ store.getSchema().getGeometryDescriptor();
+ String type = desc.getType().getName().toString();
+ String proj =
+ desc.getCoordinateReferenceSystem().
+ getCoordinateSystem().toString();
+ int pos1 = proj.indexOf("EPSG\",\"");
+ int pos2 = proj.indexOf("\"]]");
+ String epsg = "";
+ if (pos1 >= 0 && pos2 >= 0) {
+ epsg = proj.substring(pos1 + 7, pos2);
+ }
+ else {
+ logger.warn("Could not read EPSG code from shapefile.");
+ return null;
+ }
+ if (type.contains("Line")) {
+ type = "LINE";
+ }
+ else if (type.contains("Poly")) {
+ type = "POLYGON";
+ }
+ else {
+ type = "POINT";
+ }
+ createMapfile(
+ artifact,
+ artifactDir,
+ MapfileGenerator.MS_LAYER_PREFIX + USER_RGD_FILENAME,
+ USER_RGD_SHAPE,
+ type,
+ epsg,
+ "user-rgd");
+ }
+ catch (IOException e) {
+ logger.warn("No mapfile for user-rgd created!");
+ }
+ }
+ }
+ return null;
+ }
+
+ private boolean extractUserShp(File dir) {
+ File archive = new File(dir, USER_RGD_ZIP);
+ boolean exists = archive.exists();
+ logger.debug("Zip file exists: " + exists);
+ if (exists) {
+ try {
+ File tmpDir = new File(dir, "usr_tmp");
+ FileTools.extractArchive(archive, tmpDir);
+ moveFiles(tmpDir, dir);
+ return true;
+ }
+ catch (IOException ioe) {
+ logger.warn("Zip archive " + dir + "/"
+ + USER_RGD_ZIP + " could not be extracted.");
+ return false;
+ }
+ }
+ return false;
+ }
+
+ private void moveFiles(File source, final File target)
+ throws IOException
+ {
+ if (!source.exists()) {
+ return;
+ }
+ if (!target.exists()) {
+ target.mkdir();
+ }
+ FileTools.walkTree(source, new FileTools.FileVisitor() {
+ @Override
+ public boolean visit(File file) {
+ if (!file.isDirectory()) {
+ String name = file.getName();
+ String suffix = "";
+ int pos = name.lastIndexOf('.');
+ if (pos > 0 && pos < name.length() - 1) {
+ suffix = name.substring(pos + 1);
+ }
+ else {
+ return true;
+ }
+ try {
+ FileTools.copyFile(file, new File(target, USER_RGD_FILENAME + "." + suffix));
+ }
+ catch (IOException ioe) {
+ logger.warn("Error while copying file " + file.getName());
+ return true;
+ }
+ }
+ return true;
+ }
+ });
+
+ FileTools.deleteRecursive(source);
+ }
+
+ private void createMapfile(
+ FLYSArtifact artifact,
+ File artifactDir,
+ String name,
+ String hwsShapefile,
+ String type,
+ String srid,
+ String group
+ ) {
+ LayerInfo info = new LayerInfo();
+ info.setName(name + artifact.identifier());
+ info.setType(type);
+ info.setDirectory(artifact.identifier());
+ info.setTitle(name);
+ info.setData(hwsShapefile);
+ info.setSrid(srid);
+ info.setGroupTitle(group);
+ MapfileGenerator generator = new ArtifactMapfileGenerator();
+ Template tpl = generator.getTemplateByName(MapfileGenerator.SHP_LAYER_TEMPLATE);
+ try {
+ File layer = new File(artifactDir.getCanonicalPath() + "/" + name);
+ generator.writeLayer(info, layer, tpl);
+ List<String> layers = new ArrayList<String>();
+ layers.add(layer.getAbsolutePath());
+ generator.generate();
+ }
+ catch(FileNotFoundException fnfe) {
+ logger.warn("Could not find mapfile for hws layer");
+ }
+ catch (Exception ioe) {
+ logger.warn("Could not create mapfile for hws layer");
+ logger.warn(Arrays.toString(ioe.getStackTrace()));
+ }
+ }
+
+
+ @Override
+ public void endOfLife(Artifact artifact, Object callContext) {
+ super.endOfLife(artifact, callContext);
+ logger.info("HWSBarriersState.endOfLife: " + artifact.identifier());
+
+ FLYSArtifact flys = (FLYSArtifact) artifact;
+ removeDirectory(flys);
+ }
+
+
+ /**
+ * Removes the directory and all its content where the required data and the
+ * results of WSPLGEN are stored. Should be called in endOfLife().
+ */
+ // FIXME: I've seen this code somewhere else...
+ protected void removeDirectory(FLYSArtifact artifact) {
+ String shapePath = FLYSUtils.getXPathString(
+ FLYSUtils.XPATH_FLOODMAP_SHAPEFILE_DIR);
+
+ File artifactDir = new File(shapePath, artifact.identifier());
+
+ if (artifactDir.exists()) {
+ logger.debug("Delete directory: " + artifactDir.getAbsolutePath());
+ boolean success = FileTools.deleteRecursive(artifactDir);
+ if (!success) {
+ logger.warn("could not remove dir '" + artifactDir + "'");
+ }
+ }
+ else {
+ logger.debug("There is no directory to remove.");
+ }
+ }
+
+ /**
+ * Returns (and creates if not existing) the directory for storing WSPLGEN
+ * data for the owner artifact.
+ *
+ * @param artifact The owner Artifact.
+ *
+ * @return the directory for WSPLGEN data.
+ */
+ protected File getDirectory(FLYSArtifact artifact) {
+ String shapePath = FLYSUtils.getXPathString(
+ FLYSUtils.XPATH_FLOODMAP_SHAPEFILE_DIR);
+
+ File artifactDir = FileTools.getDirectory(
+ shapePath, artifact.identifier());
+
+ return artifactDir;
+ }
+
+}
+// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf-8 :
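The EPSG lookup in HWSBarriersState scans the WKT dump of the shapefile's coordinate reference system for the trailing `EPSG","<code>"]]` authority entry. That substring extraction, factored into a small helper for illustration (hypothetical class name):

```java
public class EpsgExtractor {
    // Mirrors the substring-based extraction in HWSBarriersState:
    // pick the code out of the AUTHORITY["EPSG","<code>"]] tail of a
    // WKT projection string. Returns "" if no EPSG authority is found.
    static String epsgFromWkt(String proj) {
        int pos1 = proj.indexOf("EPSG\",\"");
        int pos2 = proj.indexOf("\"]]");
        if (pos1 >= 0 && pos2 >= 0) {
            // "EPSG\",\"" is 7 characters long, so the code starts
            // immediately after pos1 + 7.
            return proj.substring(pos1 + 7, pos2);
        }
        return "";
    }
}
```

Note this is a textual heuristic, not a WKT parser; it relies on the authority entry being the last `"]]`-terminated token in the string.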
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/HWSDatacageState.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/HWSDatacageState.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,90 @@
+package de.intevation.flys.artifacts.states;
+
+import org.apache.log4j.Logger;
+import org.w3c.dom.Element;
+
+import de.intevation.artifacts.Artifact;
+import de.intevation.artifacts.CallContext;
+import de.intevation.artifacts.common.utils.XMLUtils.ElementCreator;
+import de.intevation.flys.artifacts.FLYSArtifact;
+
+
+public class HWSDatacageState
+extends DefaultState
+{
+
+ private static final Logger logger = Logger.getLogger(HWSDatacageState.class);
+
+ @Override
+ protected String getUIProvider() {
+ return "hws_datacage_panel";
+ }
+
+
+ @Override
+ protected Element createStaticData(
+ FLYSArtifact flys,
+ ElementCreator creator,
+ CallContext cc,
+ String name,
+ String value,
+ String type
+ ) {
+ Element dataElement = creator.create("data");
+ creator.addAttr(dataElement, "name", name, true);
+ creator.addAttr(dataElement, "type", type, true);
+
+ Element itemElement = creator.create("item");
+ creator.addAttr(itemElement, "value", value, true);
+
+ creator.addAttr(itemElement, "label", getLabel(cc, value), true);
+ dataElement.appendChild(itemElement);
+
+ return dataElement;
+ }
+
+
+ public static String getLabel(CallContext cc, String value) {
+ logger.debug("Create label for value: " + value);
+
+ return value;
+ }
+
+
+ @Override
+ public boolean validate(Artifact artifact)
+ throws IllegalArgumentException
+ {
+ FLYSArtifact flys = (FLYSArtifact) artifact;
+ String hws = flys.getDataAsString("uesk.hws");
+ logger.debug("hws: " + hws);
+ return true;
+ }
+
+
+ /**
+ * Returns the HWS selected in the parameters of <i>flys</i>.
+ *
+ * @param flys The FLYSArtifact that knows the selected HWS.
+ *
+ * @throws IllegalArgumentException If the FLYSArtifact doesn't know the
+ * selected HWS.
+ *
+ * @return the HWS specified by the FLYSArtifact's parameters.
+ */
+ public static String getHWS(FLYSArtifact flys)
+ throws IllegalArgumentException
+ {
+ String hws = flys.getDataAsString("uesk.hws");
+ if (hws == null) {
+ return null;
+ }
+
+ logger.debug("Found selected hws: '" + hws + "'");
+
+ return hws;
+ }
+
+
+
+}
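HWSDatacageState.createStaticData() (like UserRGDState below) builds a small `data`/`item` element pair. A sketch of that structure using plain JAXP DOM instead of the FLYS ElementCreator helper (attribute values here are illustrative):

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class StaticDataSketch {

    public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
            .newDocumentBuilder().newDocument();

        // <data name="..." type="..."> with a single <item> child.
        Element data = doc.createElement("data");
        data.setAttribute("name", "uesk.hws");
        data.setAttribute("type", "string");

        Element item = doc.createElement("item");
        item.setAttribute("value", "hws1");
        // getLabel() above simply returns the value unchanged.
        item.setAttribute("label", "hws1");

        data.appendChild(item);
        doc.appendChild(data);

        System.out.println(data.getChildNodes().getLength());
        System.out.println(item.getAttribute("label"));
    }
}
```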
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/HistoricalDischargeComputeState.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/HistoricalDischargeComputeState.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/HistoricalDischargeComputeState.java Fri Mar 22 11:25:54 2013 +0100
@@ -111,7 +111,7 @@
logger
.debug("Create another facet for historical differences.");
- // TODO CREATE BETTER TITLE FOR FACETS
+ // TODO CREATE BETTER TITLE FOR FACETS (issue1180)
facets.add(new HistoricalDischargeDifferenceFacet(i,
HISTORICAL_DISCHARGE_Q_DIFF, "DIFF: " + wqt.getName()));
}
@@ -170,6 +170,7 @@
}
}
+ /** Create title for a Discharge-curve, including date. */
protected String createDischargeCurveTitle(CallContext cc,
HistoricalWQKms wqkms) {
TimeInterval timeInterval = wqkms.getTimeInterval();
@@ -185,6 +186,8 @@
}
}
+
+ /** Create string for the facet's name/description. */
protected String createFacetTitle(WQTimerange wqt) {
String name = wqt.getName();
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/HistoricalDischargeState.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/HistoricalDischargeState.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/HistoricalDischargeState.java Fri Mar 22 11:25:54 2013 +0100
@@ -22,7 +22,6 @@
private static final Logger logger =
Logger.getLogger(HistoricalDischargeState.class);
-
public static final String I18N_MODE_W = "historical.mode.w";
public static final String I18N_MODE_Q = "historical.mode.q";
@@ -37,6 +36,7 @@
return "wq_simple_array";
}
+ @Override
protected void appendItems(
Artifact artifact,
ElementCreator creator,
@@ -83,6 +83,7 @@
}
+ /** Get label for display in client, depending on chosen W or Q input. */
@Override
protected String getLabelFor(
CallContext cc,
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/InputDoubleState.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/InputDoubleState.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/InputDoubleState.java Fri Mar 22 11:25:54 2013 +0100
@@ -6,7 +6,7 @@
/**
- * State to keep a double value and validate it against a range
+ * State to keep a double value and validate it against a range.
*/
public class InputDoubleState extends MinMaxState {
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/ScenarioSelect.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/ScenarioSelect.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/ScenarioSelect.java Fri Mar 22 11:25:54 2013 +0100
@@ -26,7 +26,6 @@
public static final String FIELD_MODE = "scenario";
- public static final String FIELD_BARRIERS = "uesk.barriers";
public static final String SCENARIO_CURRENT = "scenario.current";
public static final String SCENARIO_POTENTIEL = "scenario.potentiel";
@@ -38,30 +37,11 @@
SCENARIO_SCENRAIO };
-
@Override
protected String getUIProvider() {
- return "map_digitize";
+ return "";
}
-
- @Override
- protected void appendStaticData(
- FLYSArtifact flys,
- CallContext cc,
- ElementCreator creator,
- Element ui,
- String name
- ) {
- if (name != null && name.equals(FIELD_BARRIERS)) {
- return;
- }
- else {
- super.appendStaticData(flys, cc, creator, ui, name);
- }
- }
-
-
@Override
protected Element[] createItems(
XMLUtils.ElementCreator cr,
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/UserRGDState.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/UserRGDState.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,46 @@
+package de.intevation.flys.artifacts.states;
+
+import org.w3c.dom.Element;
+
+import de.intevation.artifacts.CallContext;
+import de.intevation.artifacts.common.utils.XMLUtils.ElementCreator;
+import de.intevation.flys.artifacts.FLYSArtifact;
+
+
+public class UserRGDState
+extends DefaultState
+{
+ @Override
+ protected String getUIProvider() {
+ return "user_rgd_panel";
+ }
+
+ @Override
+ protected Element createStaticData(
+ FLYSArtifact flys,
+ ElementCreator creator,
+ CallContext cc,
+ String name,
+ String value,
+ String type
+ ) {
+ Element dataElement = creator.create("data");
+ creator.addAttr(dataElement, "name", name, true);
+ creator.addAttr(dataElement, "type", type, true);
+
+ Element itemElement = creator.create("item");
+ creator.addAttr(itemElement, "value", value, true);
+
+ creator.addAttr(itemElement, "label", getLabel(cc, value), true);
+ dataElement.appendChild(itemElement);
+
+ return dataElement;
+ }
+
+ public static String getLabel(CallContext cc, String value) {
+
+ return value;
+ }
+
+
+}
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/WQAdapted.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/WQAdapted.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/artifacts/states/WQAdapted.java Fri Mar 22 11:25:54 2013 +0100
@@ -59,6 +59,7 @@
public static final GaugeOrder GAUGE_UP = new GaugeOrder(true);
public static final GaugeOrder GAUGE_DOWN = new GaugeOrder(false);
+
/** Trivial, empty constructor. */
public WQAdapted() {
}
@@ -117,6 +118,7 @@
}
+ /** Create the items for input to the ranges per mode. */
protected Element[] createValueItems(
XMLUtils.ElementCreator cr,
Artifact artifact,
@@ -165,7 +167,7 @@
double[] mmW = gauge.determineMinMaxW();
elements.add(createItem(
- cr, new String[] { from + ";" + to, ""}, mmQ, mmW));
+ cr, new String[] { from + ";" + to, gauge.getName()}, mmQ, mmW));
}
}
else {
@@ -186,7 +188,7 @@
double[] mmW = gauge.determineMinMaxW();
elements.add(createItem(
- cr, new String[] { to + ";" + from, ""}, mmQ, mmW));
+ cr, new String[] { to + ";" + from, gauge.getName()}, mmQ, mmW));
}
}
@@ -200,6 +202,7 @@
}
+ /** In obj: 0 is label, 1 is value. */
protected Element createItem(
XMLUtils.ElementCreator cr,
Object obj,
@@ -449,7 +452,7 @@
double lower = Double.parseDouble(parts[0]);
double upper = Double.parseDouble(parts[1]);
- String[] values = parts[2].split(",");
+ String[] values = parts[3].split(",");
int num = values.length;
double[] res = new double[num];
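The hunk above moves the comma-separated values from parts[2] to parts[3], matching the gauge name newly inserted into the item value. A sketch of that parsing, assuming a "from;to;gaugeName;v1,v2,..." layout inferred from the diff (not confirmed against the FLYS client):

```java
public class RangeParser {

    /** Split the per-gauge value string into its double values. */
    static double[] parseValues(String raw) {
        String[] parts = raw.split(";");
        // parts[0]/parts[1] hold the km range, parts[2] the gauge name.
        String[] values = parts[3].split(",");
        double[] res = new double[values.length];
        for (int i = 0; i < values.length; i++) {
            res[i] = Double.parseDouble(values[i].trim());
        }
        return res;
    }

    public static void main(String[] args) {
        double[] vs = parseValues("0.0;50.0;Koeln;1.5,2.5,3.5");
        System.out.println(vs.length);
        System.out.println(vs[0]);
    }
}
```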
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/collections/AttributeWriter.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/collections/AttributeWriter.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/collections/AttributeWriter.java Fri Mar 22 11:25:54 2013 +0100
@@ -164,7 +164,7 @@
throws ArtifactDatabaseException
{
if (compatibleFacets == null) {
- logger.warn("No compatible facets, not generating out.");
+ logger.warn("No compatible facets, not generating out " + outputName + ".");
return false;
}
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/exports/ATWriter.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/exports/ATWriter.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/exports/ATWriter.java Fri Mar 22 11:25:54 2013 +0100
@@ -25,6 +25,7 @@
import org.apache.log4j.Logger;
+/** Write AT files. */
public class ATWriter
{
private static Logger logger = Logger.getLogger(ATWriter.class);
@@ -156,7 +157,7 @@
{
PrintWriter out = new PrintWriter(writer);
- // a header is required, because the desktop version of FLYS will skip
+ // A header is required, because the desktop version of FLYS will skip
// the first row.
if (gName != null) {
printGaugeHeader(out, meta, river, km, gName, datum, date);
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/exports/AbstractExporter.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/exports/AbstractExporter.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/exports/AbstractExporter.java Fri Mar 22 11:25:54 2013 +0100
@@ -74,6 +74,12 @@
/** The master artifact. */
protected Artifact master;
+ private NumberFormat kmFormat;
+
+ private NumberFormat wFormat;
+
+ private NumberFormat qFormat;
+
/**
* Concrete subclasses need to use this method to write their special data
@@ -116,6 +122,12 @@
this.master = master;
}
+ /** Get the CallContext this exporter has been initialized with. */
+ public CallContext getCallContext() {
+ return this.context;
+ }
+
@Override
public void setCollection(FLYSArtifactCollection collection) {
@@ -234,6 +246,10 @@
return Resources.getMsg(context.getMeta(), key, def);
}
+ protected String msg(String key, String def, Object[] args) {
+ return Resources.getMsg(context.getMeta(), key, def, args);
+ }
+
/**
* This method starts CSV creation. It makes use of writeCSVData() which has
@@ -295,7 +311,10 @@
* @return the number formatter for kilometer values.
*/
protected NumberFormat getKmFormatter() {
- return Formatter.getWaterlevelKM(context);
+ if (kmFormat == null) {
+ kmFormat = Formatter.getWaterlevelKM(context);
+ }
+ return kmFormat;
}
@@ -305,7 +324,10 @@
* @return the number formatter for W values.
*/
protected NumberFormat getWFormatter() {
- return Formatter.getWaterlevelW(context);
+ if (wFormat == null) {
+ wFormat = Formatter.getWaterlevelW(context);
+ }
+ return wFormat;
}
@@ -315,7 +337,10 @@
* @return the number formatter for Q values.
*/
protected NumberFormat getQFormatter() {
- return Formatter.getWaterlevelQ(context);
+ if (qFormat == null) {
+ qFormat = Formatter.getWaterlevelQ(context);
+ }
+ return qFormat;
}
}
// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
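The getKmFormatter()/getWFormatter()/getQFormatter() changes above replace per-call construction with a lazily cached instance. A standalone sketch of the pattern (the formatter settings are illustrative, not the actual FLYS Formatter defaults):

```java
import java.text.NumberFormat;
import java.util.Locale;

public class FormatterCache {

    private NumberFormat kmFormat;

    /** Create the km formatter once, then reuse the cached instance. */
    public NumberFormat getKmFormatter() {
        if (kmFormat == null) {
            kmFormat = NumberFormat.getInstance(Locale.GERMANY);
            kmFormat.setMinimumFractionDigits(3);
            kmFormat.setMaximumFractionDigits(3);
        }
        return kmFormat;
    }

    public static void main(String[] args) {
        FormatterCache cache = new FormatterCache();
        NumberFormat a = cache.getKmFormatter();
        NumberFormat b = cache.getKmFormatter();
        System.out.println(a == b);            // same cached instance
        System.out.println(a.format(12.3456)); // German decimal comma
    }
}
```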
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/exports/CrossSectionGenerator.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/exports/CrossSectionGenerator.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/exports/CrossSectionGenerator.java Fri Mar 22 11:25:54 2013 +0100
@@ -17,6 +17,7 @@
import de.intevation.artifactdatabase.state.ArtifactAndFacet;
import de.intevation.artifacts.DataProvider;
+import de.intevation.flys.artifacts.FLYSArtifact;
import de.intevation.flys.artifacts.geom.Lines;
import de.intevation.flys.artifacts.model.CrossSectionFacet;
import de.intevation.flys.artifacts.model.FacetTypes;
@@ -28,6 +29,7 @@
import de.intevation.flys.themes.LineStyle;
import de.intevation.flys.themes.TextStyle;
import de.intevation.flys.themes.ThemeAccess;
+import de.intevation.flys.utils.FLYSUtils;
import de.intevation.flys.utils.Formatter;
import de.intevation.flys.utils.ThemeUtil;
@@ -232,7 +234,13 @@
@Override
protected String getDefaultYAxisLabel(int pos) {
- return msg(I18N_YAXIS_LABEL, I18N_YAXIS_LABEL_DEFAULT);
+ FLYSArtifact flys = (FLYSArtifact) master;
+
+ String unit = FLYSUtils.getRiver(flys).getWstUnit().getName();
+
+ return msg(I18N_YAXIS_LABEL,
+ I18N_YAXIS_LABEL_DEFAULT,
+ new Object[] { unit });
}
@@ -341,7 +349,11 @@
if (ThemeUtil.parseShowLevel(theme) && lines.points.length > 1
&& lines.points[1].length > 0) {
NumberFormat nf = Formatter.getMeterFormat(this.context);
- String labelAdd = "W=" + nf.format(lines.points[1][0]) + "NN+m";
+ FLYSArtifact flys = (FLYSArtifact) master;
+
+ String unit = FLYSUtils.getRiver(flys).getWstUnit().getName();
+
+ String labelAdd = "W=" + nf.format(lines.points[1][0]) + unit;
if (series.getLabel().length() == 0) {
series.setLabel(labelAdd);
}
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/exports/DischargeLongitudinalSectionExporter.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/exports/DischargeLongitudinalSectionExporter.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/exports/DischargeLongitudinalSectionExporter.java Fri Mar 22 11:25:54 2013 +0100
@@ -61,7 +61,8 @@
msg(CSV_KM_HEADER, DEFAULT_CSV_KM_HEADER),
msg(CSV_W_HEADER, DEFAULT_CSV_W_HEADER),
msg(CSV_CW_HEADER, DEFAULT_CSV_CW_HEADER),
- msg(CSV_Q_HEADER, DEFAULT_CSV_Q_HEADER)
+ msg(CSV_Q_HEADER, DEFAULT_CSV_Q_HEADER),
+ msg(CSV_Q_DESC_HEADER, DEFAULT_CSV_Q_DESC_HEADER)
});
}
@@ -70,8 +71,7 @@
CSVWriter writer,
WQKms wqkms,
boolean atGauge,
- boolean isQ,
- boolean isRange
+ boolean isQ
) {
logger.debug("WaterlevelExporter.wQKms2CSV");
@@ -85,6 +85,7 @@
for (int i = 0; i < size; i ++) {
result = wqkms.get(i, result);
+ String name = wqkms.getName();
String wc = "";
if (wqkms instanceof WQCKms) {
wc = wf.format(result[3]);
@@ -94,7 +95,8 @@
kmf.format(result[2]),
wf.format(result[0]),
wc,
- qf.format(result[1])
+ qf.format(result[1]),
+ name
});
}
}
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/exports/DischargeLongitudinalSectionGenerator.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/exports/DischargeLongitudinalSectionGenerator.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/exports/DischargeLongitudinalSectionGenerator.java Fri Mar 22 11:25:54 2013 +0100
@@ -54,7 +54,7 @@
Facet facet = artifactFacet.getFacet();
- if (name.equals(DISCHARGE_LONGITUDINAL_Q)) {
+ if (name.contains(DISCHARGE_LONGITUDINAL_Q)) {
doQOut(
(WQKms) artifactFacet.getData(context),
artifactFacet,
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/exports/InfoGeneratorHelper.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/exports/InfoGeneratorHelper.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/exports/InfoGeneratorHelper.java Fri Mar 22 11:25:54 2013 +0100
@@ -34,10 +34,10 @@
*/
public class InfoGeneratorHelper {
+ /** Private logging instance. */
private static final Logger logger =
Logger.getLogger(InfoGeneratorHelper.class);
-
protected ChartGenerator generator;
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/exports/WaterlevelExporter.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/exports/WaterlevelExporter.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/exports/WaterlevelExporter.java Fri Mar 22 11:25:54 2013 +0100
@@ -32,10 +32,13 @@
import de.intevation.flys.model.Gauge;
+import de.intevation.flys.artifacts.access.FixRealizingAccess;
import de.intevation.flys.artifacts.access.RangeAccess;
+import de.intevation.flys.artifacts.FixationArtifact;
import de.intevation.flys.artifacts.FLYSArtifact;
import de.intevation.flys.artifacts.WINFOArtifact;
import de.intevation.flys.artifacts.model.CalculationResult;
+import de.intevation.flys.artifacts.model.Segment;
import de.intevation.flys.artifacts.model.WQCKms;
import de.intevation.flys.artifacts.model.WQKms;
import de.intevation.flys.artifacts.model.WKmsJRDataSource;
@@ -107,7 +110,6 @@
public static final String CSV_NOT_IN_GAUGE_RANGE =
"export.waterlevel.csv.not.in.gauge.range";
-
public static final Pattern NUMBERS_PATTERN =
Pattern.compile("\\D*(\\d++.\\d*)\\D*");
@@ -183,7 +185,7 @@
* @param wqkms A WQKms object that should be prepared.
*/
protected String getColumnTitle(WINFOArtifact winfo, WQKms wqkms) {
- logger.debug("WaterlevelExporter.prepareNamedValue");
+ logger.debug("WaterlevelExporter.getColumnTitle");
String name = wqkms.getName();
@@ -424,10 +426,12 @@
) {
logger.info("WaterlevelExporter.writeCSVHeader");
+ String unit = FLYSUtils.getRiver((FLYSArtifact) master).getWstUnit().getName();
+
if (atGauge) {
writer.writeNext(new String[] {
msg(CSV_KM_HEADER, DEFAULT_CSV_KM_HEADER),
- msg(CSV_W_HEADER, DEFAULT_CSV_W_HEADER),
+ msg(CSV_W_HEADER, DEFAULT_CSV_W_HEADER, new Object[] { unit }),
msg(CSV_Q_HEADER, DEFAULT_CSV_Q_HEADER),
(isQ
? msg(CSV_Q_DESC_HEADER, DEFAULT_CSV_Q_DESC_HEADER)
@@ -439,7 +443,8 @@
else {
writer.writeNext(new String[] {
msg(CSV_KM_HEADER, DEFAULT_CSV_KM_HEADER),
- msg(CSV_W_HEADER, DEFAULT_CSV_W_HEADER),
+ // TODO flys/issue1128 (unit per river)
+ msg(CSV_W_HEADER, DEFAULT_CSV_W_HEADER, new Object[] { unit }),
msg(CSV_Q_HEADER, DEFAULT_CSV_Q_HEADER),
msg(CSV_LOCATION_HEADER, DEFAULT_CSV_LOCATION_HEADER)
});
@@ -447,6 +452,50 @@
}
+ /** Linearly search for the gauge that is valid at the given km. */
+ private Gauge findGauge(double km, List<Gauge> gauges) {
+ for (Gauge gauge: gauges) {
+ if (km >= gauge.getRange().getA().doubleValue()
+ && km <= gauge.getRange().getB().doubleValue()) {
+ return gauge;
+ }
+ }
+ return null;
+ }
+
+
+ private void writeRow4(CSVWriter writer, double wqkm[], FLYSArtifact flys) {
+ NumberFormat kmf = getKmFormatter();
+ NumberFormat wf = getWFormatter();
+ NumberFormat qf = getQFormatter();
+
+ writer.writeNext(new String[] {
+ kmf.format(wqkm[2]),
+ wf.format(wqkm[0]),
+ qf.format(wqkm[1]),
+ FLYSUtils.getLocationDescription(flys, wqkm[2])
+ });
+ }
+
+
+ /** Write a CSV row at a gauge location. */
+ private void writeRow6(CSVWriter writer, double wqkm[], String wOrQDesc,
+ FLYSArtifact flys, String gaugeName) {
+ NumberFormat kmf = getKmFormatter();
+ NumberFormat wf = getWFormatter();
+ NumberFormat qf = getQFormatter();
+
+ writer.writeNext(new String[] {
+ kmf.format(wqkm[2]),
+ wf.format(wqkm[0]),
+ qf.format(wqkm[1]),
+ wOrQDesc,
+ FLYSUtils.getLocationDescription(flys, wqkm[2]),
+ gaugeName
+ });
+ }
+
+
/**
* Write "rows" of csv data from wqkms with writer.
*/
@@ -471,6 +520,7 @@
double[] result = new double[3];
FLYSArtifact flys = (FLYSArtifact) master;
+ List<Gauge> gauges = FLYSUtils.getGauges(flys);
Gauge gauge = FLYSUtils.getGauge(flys);
String gaugeName = gauge.getName();
String desc = "";
@@ -493,36 +543,50 @@
long startTime = System.currentTimeMillis();
String colDesc = desc;
+ List<Segment> segments = null;
+ boolean isFixRealize = false;
if (flys instanceof WINFOArtifact) {
if (wqkms != null && wqkms.getRawValue() != null) {
WINFOArtifact winfo = (WINFOArtifact) flys;
colDesc = FLYSUtils.getNamedMainValue(winfo, wqkms.getRawValue());
}
}
+ else if (flys instanceof FixationArtifact) {
+ // Get W/Q input per gauge for this case.
+ FixRealizingAccess fixAccess = new FixRealizingAccess(flys, getCallContext());
+ segments = fixAccess.getSegments();
+ if (segments != null && segments.size() > 0) {
+ isFixRealize = true;
+ }
+ }
for (int i = 0; i < size; i ++) {
result = wqkms.get(i, result);
+ // Check if there has been W input per Gauge and use it.
+ if (segments != null) {
+ for (Segment segment: segments) {
+ if (segment.inside(result[2])) {
+ colDesc = "" + segment.getValues()[0];
+ }
+ }
+ }
+
if (atGauge) {
- writer.writeNext(new String[] {
- kmf.format(result[2]),
- wf.format(result[0]),
- qf.format(result[1]),
- colDesc,
- FLYSUtils.getLocationDescription(flys, result[2]),
+ String gaugeN;
+ if (isFixRealize) {
+ gaugeN = findGauge(result[2], gauges).getName();
+ }
+ else {
// TODO issue1114: Take correct gauge
- result[2] >= a && result[2] <= b
+ gaugeN = result[2] >= a && result[2] <= b
? gaugeName
- : notinrange
- });
+ : notinrange;
+ }
+ writeRow6(writer, result, colDesc, flys, gaugeN);
}
else {
- writer.writeNext(new String[] {
- kmf.format(result[2]),
- wf.format(result[0]),
- qf.format(result[1]),
- FLYSUtils.getLocationDescription(flys, result[2])
- });
+ writeRow4(writer, result, flys);
}
}
@@ -559,6 +623,9 @@
for (WQKms[] tmp: data) {
for (WQKms wqkms: tmp) {
+ if (wqkms instanceof ConstantWQKms) {
+ continue;
+ }
int size = wqkms != null ? wqkms.size() : 0;
addWSTColumn(writer, wqkms);
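The new findGauge() above does a linear scan over the gauge ranges. A self-contained sketch of that lookup; Gauge here is a simplified stand-in for the FLYS model class, not its real API:

```java
import java.util.Arrays;
import java.util.List;

public class GaugeLookup {

    static class Gauge {
        final String name;
        final double a, b; // km range covered by this gauge
        Gauge(String name, double a, double b) {
            this.name = name; this.a = a; this.b = b;
        }
    }

    /** Return the first gauge whose range [a, b] contains km, or null. */
    static Gauge findGauge(double km, List<Gauge> gauges) {
        for (Gauge gauge : gauges) {
            if (km >= gauge.a && km <= gauge.b) {
                return gauge;
            }
        }
        return null;
    }

    public static void main(String[] args) {
        List<Gauge> gauges = Arrays.asList(
            new Gauge("Upstream", 0.0, 50.0),
            new Gauge("Downstream", 50.0, 120.0));
        System.out.println(findGauge(75.0, gauges).name);
        System.out.println(findGauge(200.0, gauges)); // outside all ranges
    }
}
```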
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/exports/fixings/FixATExport.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/exports/fixings/FixATExport.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/exports/fixings/FixATExport.java Fri Mar 22 11:25:54 2013 +0100
@@ -14,7 +14,6 @@
import de.intevation.flys.artifacts.model.CalculationResult;
import de.intevation.flys.artifacts.model.Parameters;
-import de.intevation.flys.artifacts.model.fixings.FixAnalysisResult;
import de.intevation.flys.artifacts.model.fixings.FixResult;
import de.intevation.flys.exports.AbstractExporter;
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/exports/fixings/FixWQCurveGenerator.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/exports/fixings/FixWQCurveGenerator.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/exports/fixings/FixWQCurveGenerator.java Fri Mar 22 11:25:54 2013 +0100
@@ -240,7 +240,7 @@
}
}
else {
- logger.debug("doAnalysisEventsOut: qwds == null");
+ logger.debug("doReferenceEventsOut: qwds == null");
}
}
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/exports/minfo/BedDifferenceEpochGenerator.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/exports/minfo/BedDifferenceEpochGenerator.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/exports/minfo/BedDifferenceEpochGenerator.java Fri Mar 22 11:25:54 2013 +0100
@@ -206,5 +206,4 @@
addAxisSeries(series, YAXIS.H.idx, visible);
}
-
}
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/exports/process/WOutProcessor.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/exports/process/WOutProcessor.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/exports/process/WOutProcessor.java Fri Mar 22 11:25:54 2013 +0100
@@ -26,10 +26,10 @@
@Override
public void doOut(
XYChartGenerator generator,
- ArtifactAndFacet aaf,
- Document theme,
- boolean visible,
- int index)
+ ArtifactAndFacet aaf,
+ Document theme,
+ boolean visible,
+ int index)
{
CallContext context = generator.getCallContext();
WKms wkms = (WKms) aaf.getData(context);
@@ -62,7 +62,7 @@
}
/**
- * Returns true if facettype is longitutinal_section.w
+ * Returns true if facettype is longitudinal_section.w.
*/
@Override
public boolean canHandle(String facettype) {
@@ -109,5 +109,5 @@
}
generator.setInverted(inv);
}
-
}
+// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/mapserver/RiverMapfileGenerator.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/mapserver/RiverMapfileGenerator.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/mapserver/RiverMapfileGenerator.java Fri Mar 22 11:25:54 2013 +0100
@@ -1,11 +1,10 @@
package de.intevation.flys.mapserver;
import com.vividsolutions.jts.geom.Envelope;
-import com.vividsolutions.jts.geom.LineString;
+import com.vividsolutions.jts.geom.MultiLineString;
import de.intevation.flys.artifacts.model.LayerInfo;
import de.intevation.flys.artifacts.model.RiverFactory;
-
import de.intevation.flys.model.River;
import de.intevation.flys.model.RiverAxis;
import de.intevation.flys.utils.FLYSUtils;
@@ -13,14 +12,11 @@
import java.io.File;
import java.io.FileNotFoundException;
-
import java.util.ArrayList;
import java.util.List;
-
import java.util.regex.Pattern;
import org.apache.log4j.Logger;
-
import org.apache.velocity.Template;
public class RiverMapfileGenerator extends MapfileGenerator {
@@ -73,7 +69,7 @@
logger.warn("River " + river.getName() + " has no river axis!");
continue;
}
- LineString geom = riverAxis.get(0).getGeom();
+ MultiLineString geom = riverAxis.get(0).getGeom();
Envelope extent = geom.getEnvelopeInternal();
createRiverAxisLayer(
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/utils/ArtifactMapfileGenerator.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-artifacts/src/main/java/de/intevation/flys/utils/ArtifactMapfileGenerator.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,340 @@
+package de.intevation.flys.utils;
+
+import de.intevation.artifacts.CallContext;
+import de.intevation.flys.artifacts.FLYSArtifact;
+import de.intevation.flys.artifacts.access.RiverAccess;
+import de.intevation.flys.artifacts.model.LayerInfo;
+import de.intevation.flys.artifacts.model.map.WMSDBLayerFacet;
+import de.intevation.flys.artifacts.model.map.WMSLayerFacet;
+import de.intevation.flys.artifacts.model.map.WSPLGENLayerFacet;
+import de.intevation.flys.artifacts.resources.Resources;
+
+import java.io.File;
+import java.io.FileNotFoundException;
+import java.io.IOException;
+import java.util.List;
+
+import org.apache.log4j.Logger;
+import org.apache.velocity.Template;
+import org.geotools.data.shapefile.ShpFiles;
+import org.geotools.data.shapefile.shp.ShapefileHeader;
+import org.geotools.data.shapefile.shp.ShapefileReader;
+
+public class ArtifactMapfileGenerator extends MapfileGenerator {
+
+ private static Logger logger = Logger.getLogger(ArtifactMapfileGenerator.class);
+
+ @Override
+ protected String getVelocityLogfile() {
+ return FLYSUtils.getXPathString(FLYSUtils.XPATH_FLOODMAP_VELOCITY_LOGFILE);
+ }
+
+ @Override
+ protected String getMapserverTemplatePath() {
+ return FLYSUtils.getXPathString(FLYSUtils.XPATH_FLOODMAP_MAPSERVER_TEMPLATE_PATH);
+ }
+
+ @Override
+ public String getMapserverUrl() {
+ return FLYSUtils.getXPathString(FLYSUtils.XPATH_FLOODMAP_MAPSERVER_URL);
+ }
+
+ /**
+ * Searches the user directories for meta information files and starts
+ * the mapfile generation.
+ */
+ @Override
+ public void generate() throws IOException
+ {
+ File[] userDirs = getUserDirs();
+ List<String> layers = parseLayers(userDirs);
+ logger.info("Found " + layers.size() + " layers for user mapfile.");
+
+ writeMapfile(layers);
+ }
+
+ /**
+ * Creates a layer file used for Mapserver's mapfile which represents the
+ * floodmap.
+ *
+ * @param flys The FLYSArtifact that owns <i>wms</i>.
+ * @param wms The WSPLGENLayerFacet that contains information for the layer.
+ */
+ public void createUeskLayer(
+ FLYSArtifact flys,
+ WSPLGENLayerFacet wms,
+ String style,
+ CallContext context
+ ) throws FileNotFoundException, IOException
+ {
+ logger.debug("createUeskLayer");
+
+ LayerInfo layerinfo = new LayerInfo();
+ layerinfo.setName(MS_WSPLGEN_PREFIX + flys.identifier());
+ layerinfo.setType("POLYGON");
+ layerinfo.setDirectory(flys.identifier());
+ layerinfo.setData(WSPLGEN_RESULT_SHAPE);
+ layerinfo.setTitle(Resources.getMsg(Resources.getLocale(context.getMeta()),
+ "floodmap.uesk",
+ "Floodmap"));
+ layerinfo.setStyle(style);
+ RiverAccess access = new RiverAccess(flys);
+ String river = access.getRiver();
+ layerinfo.setSrid(FLYSUtils.getRiverDGMSrid(river));
+
+ String name = MS_LAYER_PREFIX + wms.getName();
+
+ Template template = getTemplateByName(WSPLGEN_LAYER_TEMPLATE);
+ if (template == null) {
+ logger.warn("Template '" + WSPLGEN_LAYER_TEMPLATE + "' not found.");
+ return;
+ }
+
+ try {
+ File dir = new File(getShapefileBaseDir(), flys.identifier());
+ writeLayer(layerinfo, new File(dir, name), template);
+ }
+ catch (FileNotFoundException fnfe) {
+ logger.error(fnfe, fnfe);
+ logger.warn("Unable to write layer: " + name);
+ }
+ }
+
+
+ /**
+ * Creates a layer file used for Mapserver's mapfile which represents the
+ * user defined barriers.
+ *
+ * @param flys The FLYSArtifact that owns <i>wms</i>.
+ * @param wms The WMSLayerFacet that contains information for the layer.
+ */
+ public void createBarriersLayer(FLYSArtifact flys, WMSLayerFacet wms)
+ throws FileNotFoundException, IOException
+ {
+ logger.debug("createBarriersLayer");
+
+ //String uuid = flys.identifier();
+ //File dir = new File(getShapefileBaseDir(), uuid);
+
+ createBarriersLineLayer(flys, wms);
+ createBarriersPolygonLayer(flys, wms);
+ }
+
+
+ protected void createBarriersLineLayer(
+ FLYSArtifact flys,
+ WMSLayerFacet wms
+ )
+ throws FileNotFoundException, IOException
+ {
+ String uuid = flys.identifier();
+ String group = MS_BARRIERS_PREFIX + uuid;
+ String groupTitle = "I18N_BARRIERS_TITLE";
+
+ File dir = new File(getShapefileBaseDir(), uuid);
+ File test = new File(dir, WSPLGEN_LINES_SHAPE);
+
+ if (!test.exists() || !test.canRead()) {
+ logger.debug("No barrier line layer existing.");
+ return;
+ }
+
+ LayerInfo lineInfo = new LayerInfo();
+ lineInfo.setName(MS_LINE_PREFIX + uuid);
+ lineInfo.setType("LINE");
+ lineInfo.setDirectory(uuid);
+ lineInfo.setData(WSPLGEN_LINES_SHAPE);
+ lineInfo.setTitle("I18N_LINE_SHAPE");
+ lineInfo.setGroup(group);
+ lineInfo.setGroupTitle(groupTitle);
+ lineInfo.setSrid(wms.getSrid());
+
+ String nameLines = MS_LAYER_PREFIX + wms.getName() + "-lines";
+
+ Template tpl = getTemplateByName(SHP_LAYER_TEMPLATE);
+ if (tpl == null) {
+ logger.warn("Template '" + SHP_LAYER_TEMPLATE + "' not found.");
+ return;
+ }
+
+ try {
+ writeLayer(lineInfo, new File(dir, nameLines), tpl);
+ }
+ catch (FileNotFoundException fnfe) {
+ logger.error(fnfe, fnfe);
+ logger.warn("Unable to write layer: " + nameLines);
+ }
+ }
+
+ protected void createBarriersPolygonLayer(
+ FLYSArtifact flys,
+ WMSLayerFacet wms
+ )
+ throws FileNotFoundException, IOException
+ {
+ String uuid = flys.identifier();
+ String group = uuid + MS_BARRIERS_PREFIX;
+ String groupTitle = "I18N_BARRIERS_TITLE";
+
+ File dir = new File(getShapefileBaseDir(), uuid);
+ File test = new File(dir, WSPLGEN_POLYGONS_SHAPE);
+
+ if (!test.exists() || !test.canRead()) {
+ logger.debug("No barrier polygon layer existing.");
+ return;
+ }
+
+ LayerInfo polygonInfo = new LayerInfo();
+ polygonInfo.setName(MS_POLYGONS_PREFIX + uuid);
+ polygonInfo.setType("POLYGON");
+ polygonInfo.setDirectory(uuid);
+ polygonInfo.setData(WSPLGEN_POLYGONS_SHAPE);
+ polygonInfo.setTitle("I18N_POLYGON_SHAPE");
+ polygonInfo.setGroup(group);
+ polygonInfo.setGroupTitle(groupTitle);
+ polygonInfo.setSrid(wms.getSrid());
+
+ String namePolygons = MS_LAYER_PREFIX + wms.getName() + "-polygons";
+
+ Template tpl = getTemplateByName(SHP_LAYER_TEMPLATE);
+ if (tpl == null) {
+ logger.warn("Template '" + SHP_LAYER_TEMPLATE + "' found.");
+ return;
+ }
+
+ try {
+ writeLayer(polygonInfo, new File(dir, namePolygons), tpl);
+ }
+ catch (FileNotFoundException fnfe) {
+ logger.error(fnfe, fnfe);
+ logger.warn("Unable to write layer: " + namePolygons);
+ }
+ }
+
+
+ /**
+ * Creates a layer file used for Mapserver's mapfile which represents the
+ * shape files uploaded by the user.
+ *
+ * @param flys The FLYSArtifact that owns <i>wms</i>.
+ * @param wms The WMSLayerFacet that contains information for the layer.
+ */
+ public void createUserShapeLayer(FLYSArtifact flys, WMSLayerFacet wms)
+ throws FileNotFoundException, IOException
+ {
+ logger.debug("createUserShapeLayer");
+
+ String uuid = flys.identifier();
+ File dir = new File(getShapefileBaseDir(), uuid);
+ File test = new File(dir, WSPLGEN_USER_SHAPE);
+
+ if (!test.exists() || !test.canRead()) {
+ logger.debug("No user layer existing.");
+ return;
+ }
+
+ File userShape = new File(dir, WSPLGEN_USER_SHAPE);
+ ShpFiles sf = new ShpFiles(userShape);
+ ShapefileReader sfr = new ShapefileReader(sf, true, false, null);
+ ShapefileHeader sfh = sfr.getHeader();
+
+ String group = uuid + MS_USERSHAPE_PREFIX;
+ String groupTitle = "I18N_USER_SHAPE_TITLE";
+
+ LayerInfo info = new LayerInfo();
+ info.setName(MS_USERSHAPE_PREFIX + uuid);
+ if (sfh.getShapeType().isLineType()) {
+ info.setType("LINE");
+ }
+ else if (sfh.getShapeType().isPolygonType()) {
+ info.setType("POLYGON");
+ }
+ else {
+ return;
+ }
+ info.setDirectory(uuid);
+ info.setData(WSPLGEN_USER_SHAPE);
+ info.setTitle("I18N_USER_SHAPE");
+ info.setGroup(group);
+ info.setGroupTitle(groupTitle);
+ info.setSrid(wms.getSrid());
+
+ String nameUser = MS_LAYER_PREFIX + wms.getName();
+
+ Template tpl = getTemplateByName(SHP_LAYER_TEMPLATE);
+ if (tpl == null) {
+ logger.warn("Template '" + SHP_LAYER_TEMPLATE + "' found.");
+ return;
+ }
+
+ try {
+ writeLayer(info, new File(dir, nameUser), tpl);
+ }
+ catch (FileNotFoundException fnfe) {
+ logger.error(fnfe, fnfe);
+ logger.warn("Unable to write layer: " + nameUser);
+ }
+
+ }
+
+
+ /**
+ * Creates a layer file used for Mapserver's mapfile which represents
+ * geometries from database.
+ *
+ * @param flys The FLYSArtifact that owns <i>wms</i>.
+ * @param wms The WMSLayerFacet that contains information for the layer.
+ */
+ public void createDatabaseLayer(
+ FLYSArtifact flys,
+ WMSDBLayerFacet wms,
+ String style
+ )
+ throws FileNotFoundException, IOException
+ {
+ logger.debug("createDatabaseLayer");
+
+ LayerInfo layerinfo = new LayerInfo();
+ layerinfo.setName(wms.getName() + "-" + flys.identifier());
+ layerinfo.setType(wms.getGeometryType());
+ layerinfo.setFilter(wms.getFilter());
+ layerinfo.setData(wms.getData());
+ layerinfo.setTitle(wms.getDescription());
+ layerinfo.setStyle(style);
+ if(wms.getExtent() != null) {
+ layerinfo.setExtent(GeometryUtils.jtsBoundsToOLBounds(wms.getExtent()));
+ }
+ layerinfo.setConnection(wms.getConnection());
+ layerinfo.setConnectionType(wms.getConnectionType());
+ layerinfo.setLabelItem(wms.getLabelItem());
+ layerinfo.setSrid(wms.getSrid());
+
+ String name = MS_LAYER_PREFIX + wms.getName();
+
+ Template template = getTemplateByName(DB_LAYER_TEMPLATE);
+ if (template == null) {
+ logger.warn("Template '" + DB_LAYER_TEMPLATE + "' found.");
+ return;
+ }
+
+ try {
+ File dir = new File(getShapefileBaseDir(), flys.identifier());
+ writeLayer(layerinfo, new File(dir, name), template);
+ }
+ catch (FileNotFoundException fnfe) {
+ logger.error(fnfe, fnfe);
+ logger.warn("Unable to write layer: " + name);
+ }
+ }
+
+ @Override
+ protected String getMapfilePath() {
+ return FLYSUtils.getXPathString(FLYSUtils.XPATH_FLOODMAP_MAPFILE_PATH);
+ }
+
+ @Override
+ protected String getMapfileTemplate() {
+ return FLYSUtils.getXPathString(FLYSUtils.XPATH_FLOODMAP_MAPFILE_TEMPLATE);
+ }
+
+}
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/utils/DoubleUtil.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/utils/DoubleUtil.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/utils/DoubleUtil.java Fri Mar 22 11:25:54 2013 +0100
@@ -16,6 +16,9 @@
public static final double DEFAULT_STEP_PRECISION = 1e6;
+ /** EPSILON for comparison of double precision values. */
+ public static final double EPSILON = 1e-4;
+
private DoubleUtil() {
}
@@ -31,6 +34,10 @@
return Math.round(x * DEFAULT_STEP_PRECISION)/DEFAULT_STEP_PRECISION;
}
+ /**
+ * Returns an array with values ranging from <i>from</i> to <i>to</i>
+ * with the given step width. Both <i>from</i> and <i>to</i> are included.
+ */
public static final double [] explode(
double from,
double to,
@@ -56,7 +63,7 @@
double max = Math.max(from, to);
for (int idx = 0; idx < num; idx++) {
- if (lower > max) {
+ if (lower - max > EPSILON) {
return Arrays.copyOfRange(values, 0, idx);
}
@@ -177,7 +184,7 @@
vs.resetQuick();
- for (String valueStr: parts[2].split(",")) {
+ for (String valueStr: parts[3].split(",")) {
vs.add(round(Double.parseDouble(valueStr.trim())));
}
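The EPSILON change in `explode()` above guards the loop termination against accumulated floating-point error. A minimal standalone sketch of the idea (class name and the simplified `num` computation are hypothetical, not the actual `DoubleUtil` code):

```java
import java.util.Arrays;

public class ExplodeSketch {

    public static final double EPSILON = 1e-4;

    // Builds [from, from + step, ...] up to and including 'to'.
    public static double[] explode(double from, double to, double step) {
        int num = (int) Math.round(Math.abs(to - from) / step) + 1;
        double[] values = new double[num];
        double lower = Math.min(from, to);
        double max = Math.max(from, to);
        for (int idx = 0; idx < num; idx++) {
            // A strict 'lower > max' check may drop the last value:
            // three additions of 0.1 yield 0.30000000000000004 > 0.3.
            // The epsilon comparison tolerates that rounding error.
            if (lower - max > EPSILON) {
                return Arrays.copyOfRange(values, 0, idx);
            }
            values[idx] = lower;
            lower += step;
        }
        return values;
    }

    public static void main(String[] args) {
        System.out.println(explode(0.0, 0.3, 0.1).length); // 4: 0.3 is kept
    }
}
```

With the old strict comparison the final value 0.3 would be cut off, because the accumulated sum slightly overshoots the upper bound.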
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/utils/GeometryUtils.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/utils/GeometryUtils.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/utils/GeometryUtils.java Fri Mar 22 11:25:54 2013 +0100
@@ -30,6 +30,7 @@
import org.geotools.geometry.jts.JTS;
import org.geotools.geometry.jts.ReferencedEnvelope;
import org.geotools.referencing.CRS;
+import org.hibernate.HibernateException;
import org.opengis.feature.simple.SimpleFeature;
import org.opengis.feature.simple.SimpleFeatureType;
import org.opengis.referencing.FactoryException;
@@ -50,24 +51,30 @@
}
public static Envelope getRiverBoundary(String rivername) {
- List<RiverAxis> axes = RiverAxis.getRiverAxis(rivername);
- if (axes != null && axes.size() > 0) {
- Envelope max = null;
+ try {
+ List<RiverAxis> axes = RiverAxis.getRiverAxis(rivername);
+ if (axes != null && axes.size() > 0) {
+ Envelope max = null;
- for (RiverAxis axis: axes) {
- // TODO Take the correct EPSG into account. Maybe, we need to
- // reproject the geometry.
- Envelope env = axis.getGeom().getEnvelopeInternal();
+ for (RiverAxis axis: axes) {
+ // TODO Take the correct EPSG into account. Maybe, we need to
+ // reproject the geometry.
+ Envelope env = axis.getGeom().getEnvelopeInternal();
- if (max == null) {
- max = env;
+ if (max == null) {
+ max = env;
+ }
+ else {
+ max.expandToInclude(env);
+ }
}
- else {
- max.expandToInclude(env);
- }
+
+ return max;
}
-
- return max;
+ }
+ catch (HibernateException he) {
+ logger.warn("No valid river axis found for " + rivername);
+ return null;
}
return null;
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/utils/MapUtils.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/utils/MapUtils.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/utils/MapUtils.java Fri Mar 22 11:25:54 2013 +0100
@@ -14,10 +14,10 @@
private static final Logger logger = Logger.getLogger(MapUtils.class);
public static final Pattern DB_URL_PATTERN =
- Pattern.compile("(.*)\\/\\/(.*):([0-9]+)\\/([a-zA-Z]+)");
+ Pattern.compile("(.*)\\/\\/(.*):([0-9]+)\\/([\\.a-zA-Z0-9_-]+)");
public static final Pattern DB_PSQL_URL_PATTERN =
- Pattern.compile("(.*)\\/\\/(.*):([0-9]+)\\/([a-zA-Z0-9]+)");
+ Pattern.compile("(.*)\\/\\/(.*):([0-9]+)\\/([a-zA-Z0-9_-]+)");
/**
* This method returns a connection string for databases used by
@@ -46,22 +46,25 @@
logger.debug("Groups for connection string: " + m.groupCount());
int groups = m.groupCount();
- for (int i = 0; i <= groups; i++) {
- logger.debug("Group " + i + ": " + m.group(i));
+
+ if (logger.isDebugEnabled()) {
+ for (int i = 0; i <= groups; i++) {
+ logger.debug("Group " + i + ": " + m.group(i));
+ }
}
String connection = null;
if (FLYSUtils.isUsingOracle()) {
- if (groups < 3) {
+ if (groups < 4) {
logger.warn("Could only partially parse connection string.");
return null;
}
String host = m.group(2);
String port = m.group(3);
-
- connection = user + "/" + pass + "@" + host;
+ String backend = m.group(4);
+ connection = user + "/" + pass + "@" + host + "/" + backend;
}
else {
if (groups < 4) {
@@ -73,19 +76,30 @@
String port = m.group(3);
String db = m.group(4);
- StringBuilder sb = new StringBuilder();
- sb.append("dbname=" + db);
- sb.append("host='" + host + "'");
- sb.append("port=" + port);
- sb.append("password='" + pass + "'");
- sb.append("sslmode=disable");
-
- connection = sb.toString();
+ connection = createConnectionString(user, pass, host, db, port);
}
return connection;
}
+ public static String createConnectionString(
+ String user,
+ String pass,
+ String host,
+ String db,
+ String port
+ ) {
+ StringBuilder sb = new StringBuilder();
+ sb.append("dbname=").append(db);
+ sb.append(" host='").append(host).append("'");
+ sb.append(" user=").append(user);
+ sb.append(" port=").append(port);
+ // XXX: We need to escape this somehow.
+ sb.append(" password='").append(pass).append("'");
+ sb.append(" sslmode=disable");
+ return sb.toString();
+ }
+
protected static String getPostgreSQLConnection() {
SessionFactoryImpl sf = (SessionFactoryImpl)
SessionFactoryProvider.getSessionFactory();
@@ -120,15 +134,7 @@
String port = m.group(3);
String db = m.group(4);
- StringBuilder sb = new StringBuilder();
- sb.append("dbname=" + db);
- sb.append(" host='" + host + "'");
- sb.append(" port=" + port);
- sb.append(" user=" + user);
- sb.append(" password='" + pass + "'");
- sb.append(" sslmode=disable");
-
- connection = sb.toString();
+ connection = createConnectionString(user, pass, host, db, port);
logger.debug("Created connection: '" + connection + "'");
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/utils/MapfileGenerator.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-artifacts/src/main/java/de/intevation/flys/utils/MapfileGenerator.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,384 @@
+package de.intevation.flys.utils;
+
+import de.intevation.artifacts.common.utils.Config;
+import de.intevation.flys.artifacts.model.LayerInfo;
+
+import java.io.File;
+import java.io.FileNotFoundException;
+import java.io.FileWriter;
+import java.io.FilenameFilter;
+import java.io.IOException;
+import java.io.Writer;
+import java.util.ArrayList;
+import java.util.Date;
+import java.util.List;
+
+import org.apache.log4j.Logger;
+import org.apache.velocity.Template;
+import org.apache.velocity.VelocityContext;
+import org.apache.velocity.app.VelocityEngine;
+import org.apache.velocity.runtime.RuntimeConstants;
+
+/**
+ * This class iterates over a set of directories, searches for meta
+ * information corresponding to shapefiles and creates a mapfile which is
+ * used by a <i>MapServer</i>.
+ *
+ * @author <a href="mailto:ingo.weinzierl at intevation.de">Ingo Weinzierl</a>
+ */
+public abstract class MapfileGenerator
+{
+ public static final String WSPLGEN_RESULT_SHAPE = "wsplgen.shp";
+ public static final String WSPLGEN_LINES_SHAPE = "barrier_lines.shp";
+ public static final String WSPLGEN_POLYGONS_SHAPE = "barrier_polygons.shp";
+ public static final String WSPLGEN_USER_SHAPE = "user-rgd.shp";
+
+ public static final String WSPLGEN_LAYER_TEMPLATE = "wsplgen_layer.vm";
+ public static final String SHP_LAYER_TEMPLATE = "shapefile_layer.vm";
+ public static final String DB_LAYER_TEMPLATE = "db_layer.vm";
+ public static final String RIVERAXIS_LAYER_TEMPLATE = "riveraxis-layer.vm";
+
+ public static final String MS_WSPLGEN_PREFIX = "wsplgen-";
+ public static final String MS_BARRIERS_PREFIX = "barriers-";
+ public static final String MS_LINE_PREFIX = "lines-";
+ public static final String MS_POLYGONS_PREFIX = "polygons-";
+ public static final String MS_LAYER_PREFIX = "ms_layer-";
+ public static final String MS_USERSHAPE_PREFIX = "user-";
+
+ private static Logger logger = Logger.getLogger(MapfileGenerator.class);
+
+ private File shapefileDirectory;
+
+ private VelocityEngine velocityEngine;
+
+
+ protected MapfileGenerator() {
+ }
+
+
+ /**
+ * Method to check the existence of a template file.
+ *
+ * @param templateID The name of a template.
+ * @return true if the template exists, otherwise false.
+ */
+ public boolean templateExists(String templateID){
+ Template template = getTemplateByName(templateID);
+ return template != null;
+ }
+
+
+ public abstract void generate() throws Exception;
+
+
+ /**
+ * Returns the VelocityEngine used for the template mechanism.
+ *
+ * @return the velocity engine.
+ */
+ protected VelocityEngine getVelocityEngine() {
+ if (velocityEngine == null) {
+ velocityEngine = new VelocityEngine();
+ try {
+ setupVelocity(velocityEngine);
+ }
+ catch (Exception e) {
+ logger.error(e, e);
+ return null;
+ }
+ }
+ return velocityEngine;
+ }
+
+
+ /**
+ * Initialize velocity.
+ *
+ * @param engine Velocity engine.
+ * @throws Exception if an error occurred while initializing velocity.
+ */
+ protected void setupVelocity(VelocityEngine engine)
+ throws Exception
+ {
+ engine.setProperty(
+ "input.encoding",
+ "UTF-8");
+
+ engine.setProperty(
+ RuntimeConstants.RUNTIME_LOG,
+ getVelocityLogfile());
+
+ engine.setProperty(
+ "resource.loader",
+ "file");
+
+ engine.setProperty(
+ "file.resource.loader.path",
+ getMapserverTemplatePath());
+
+ engine.init();
+ }
+
+ protected abstract String getVelocityLogfile();
+
+ protected abstract String getMapserverTemplatePath();
+
+ public abstract String getMapserverUrl();
+
+ protected VelocityContext getVelocityContext() {
+ VelocityContext context = new VelocityContext();
+
+ try {
+ context.put("MAPSERVERURL",
+ getMapserverUrl());
+ context.put("SHAPEFILEPATH",
+ getShapefileBaseDir().getCanonicalPath());
+ context.put("CONFIGDIR",
+ Config.getConfigDirectory().getCanonicalPath());
+ }
+ catch (FileNotFoundException fnfe) {
+ // this is bad
+ logger.warn(fnfe, fnfe);
+ }
+ catch (IOException ioe) {
+ // this is also bad
+ logger.warn(ioe, ioe);
+ }
+
+ return context;
+ }
+
+
+ /**
+ * Returns a template specified by <i>model</i>.
+ *
+ * @param model The name of the template.
+ * @return a template.
+ */
+ public Template getTemplateByName(String model) {
+ if (model.indexOf(".vm") < 0) {
+ model = model.concat(".vm");
+ }
+
+ try {
+ VelocityEngine engine = getVelocityEngine();
+ if (engine == null) {
+ logger.error("Error while fetching VelocityEngine.");
+ return null;
+ }
+
+ return engine.getTemplate(model);
+ }
+ catch (Exception e) {
+ logger.warn(e, e);
+ }
+
+ return null;
+ }
+
+
+ /**
+ * Returns the mapfile template.
+ *
+ * @return the mapfile template.
+ * @throws Exception if an error occurred while reading the configuration.
+ */
+ protected Template getMapfileTemplateObj()
+ throws Exception
+ {
+ String mapfileName = getMapfileTemplate();
+ return getTemplateByName(mapfileName);
+ }
+
+ protected abstract String getMapfilePath();
+
+ protected abstract String getMapfileTemplate();
+
+
+ /**
+ * Returns the base directory storing the shapefiles.
+ *
+ * @return the shapefile base directory.
+ *
+ * @throws FileNotFoundException if no shapefile path is found or
+ * configured.
+ */
+ public File getShapefileBaseDir()
+ throws FileNotFoundException, IOException
+ {
+ if (shapefileDirectory == null) {
+ String path = FLYSUtils.getXPathString(
+ FLYSUtils.XPATH_FLOODMAP_SHAPEFILE_DIR);
+
+ if (path != null) {
+ shapefileDirectory = new File(path);
+ }
+
+ if (shapefileDirectory == null) {
+ throw new FileNotFoundException("No shapefile directory given");
+ }
+
+ if (!shapefileDirectory.exists()) {
+ shapefileDirectory.mkdirs();
+ }
+ }
+
+ return shapefileDirectory;
+ }
+
+
+ protected File[] getUserDirs()
+ throws FileNotFoundException, IOException
+ {
+ File baseDir = getShapefileBaseDir();
+ File[] artifactDirs = baseDir.listFiles();
+
+ // TODO ONLY RETURN DIRECTORIES OF THE SPECIFIED USER
+
+ return artifactDirs;
+ }
+
+
+
+ protected List<String> parseLayers(File[] dirs) {
+ List<String> layers = new ArrayList<String>();
+
+ for (File dir: dirs) {
+ File[] layerFiles = dir.listFiles(new FilenameFilter() {
+ @Override
+ public boolean accept(File directory, String name) {
+ return name.startsWith(MS_LAYER_PREFIX);
+ }
+ });
+
+ for (File layer: layerFiles) {
+ try {
+ layers.add(layer.getCanonicalPath());
+ }
+ catch (IOException ioe) {
+ logger.warn(ioe, ioe);
+ }
+ }
+ }
+
+ return layers;
+ }
+
+
+
+
+ /**
+ * Creates a layer snippet which might be included in the mapfile.
+ *
+ * @param layerInfo A LayerInfo object that contains all necessary
+ * information to build a Mapserver LAYER section.
+ * @param layerFile The file the LAYER snippet is written to.
+ * @param tpl The Velocity template which is used to create the LAYER
+ * section.
+ */
+ public void writeLayer(
+ LayerInfo layerInfo,
+ File layerFile,
+ Template tpl
+ )
+ throws FileNotFoundException
+ {
+ if (logger.isDebugEnabled()) {
+ logger.debug("Write layer for:");
+ logger.debug(" directory/file: " + layerFile.getName());
+ }
+
+ Writer writer = null;
+
+ try {
+ writer = new FileWriter(layerFile);
+
+ VelocityContext context = getVelocityContext();
+ context.put("LAYER", layerInfo);
+
+ tpl.merge(context, writer);
+ }
+ catch (FileNotFoundException fnfe) {
+ logger.error(fnfe, fnfe);
+ }
+ catch (IOException ioe) {
+ logger.error(ioe, ioe);
+ }
+ catch (Exception e) {
+ logger.error(e, e);
+ }
+ finally {
+ try {
+ if (writer != null) {
+ writer.close();
+ }
+ }
+ catch (IOException ioe) {
+ logger.debug(ioe, ioe);
+ }
+ }
+ }
+
+
+ /**
+ * Creates a mapfile with the layer information stored in <i>layers</i>.
+ *
+ * @param layers Layer information.
+ */
+ public void writeMapfile(List<String> layers) {
+ String tmpMapName = "mapfile" + new Date().getTime();
+
+ File mapfile = new File(getMapfilePath());
+
+ File tmp = null;
+ Writer writer = null;
+
+ try {
+ tmp = new File(mapfile.getParent(), tmpMapName);
+ tmp.createNewFile();
+
+ writer = new FileWriter(tmp);
+
+ VelocityContext context = getVelocityContext();
+ context.put("LAYERS", layers);
+
+ Template mapTemplate = getMapfileTemplateObj();
+ if (mapTemplate == null) {
+ logger.warn("No mapfile template found.");
+ return;
+ }
+
+ mapTemplate.merge(context, writer);
+
+ // We create a temporary mapfile first and rename it to the real
+ // mapfile afterwards, so readers never see a half-written file
+ // and we avoid race conditions. (iw)
+ tmp.renameTo(mapfile);
+ }
+ catch (FileNotFoundException fnfe) {
+ logger.error(fnfe, fnfe);
+ }
+ catch (IOException ioe) {
+ logger.error(ioe, ioe);
+ }
+ catch (Exception e) {
+ logger.error(e, e);
+ }
+ finally {
+ try {
+ if (writer != null) {
+ writer.close();
+ }
+
+ if (tmp.exists()) {
+ tmp.delete();
+ }
+ }
+ catch (IOException ioe) {
+ logger.debug(ioe, ioe);
+ }
+ }
+ }
+}
+// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
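The temp-file-then-rename idiom used by `writeMapfile()` can be sketched in isolation (class and method names hypothetical; note that `File.renameTo()` over an existing target is only atomic on POSIX systems and may fail on Windows):

```java
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.io.Writer;

public class AtomicMapfileWrite {

    // Writes to a temp file in the same directory, then renames it over
    // the target, so MapServer never reads a half-written mapfile.
    public static void writeAtomically(File target, String content)
        throws IOException
    {
        File tmp = new File(
            target.getParent(), "mapfile" + System.currentTimeMillis());
        Writer writer = new FileWriter(tmp);
        try {
            writer.write(content);
        }
        finally {
            writer.close();
        }
        if (!tmp.renameTo(target)) {
            tmp.delete();
            throw new IOException("rename failed: " + tmp);
        }
    }

    /** Demo helper: writes an 8-byte stub mapfile, returns its length. */
    public static long demo() {
        try {
            File target = File.createTempFile("demo", ".map");
            target.deleteOnExit();
            writeAtomically(target, "MAP\nEND\n");
            return target.length();
        }
        catch (IOException ioe) {
            return -1L;
        }
    }

    public static void main(String[] args) {
        System.out.println(demo()); // 8
    }
}
```

Writing the temp file into the same directory as the target matters: a rename across filesystems would not be atomic.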
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/utils/Pair.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/utils/Pair.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/utils/Pair.java Fri Mar 22 11:25:54 2013 +0100
@@ -37,5 +37,13 @@
public B getB() {
return b;
}
+
+ public void setA(A a) {
+ this.a = a;
+ }
+
+ public void setB(B b) {
+ this.b = b;
+ }
}
// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/utils/RiverMapfileGenerator.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-artifacts/src/main/java/de/intevation/flys/utils/RiverMapfileGenerator.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,161 @@
+package de.intevation.flys.utils;
+
+import com.vividsolutions.jts.geom.Envelope;
+import com.vividsolutions.jts.geom.MultiLineString;
+
+import de.intevation.flys.artifacts.model.LayerInfo;
+import de.intevation.flys.artifacts.model.RiverFactory;
+
+import de.intevation.flys.model.River;
+import de.intevation.flys.model.RiverAxis;
+
+import java.io.File;
+import java.io.FileNotFoundException;
+
+import java.util.ArrayList;
+import java.util.List;
+
+import java.util.regex.Pattern;
+
+import org.apache.log4j.Logger;
+
+import org.apache.velocity.Template;
+import org.hibernate.HibernateException;
+
+public class RiverMapfileGenerator extends MapfileGenerator {
+
+ public static final String XPATH_RIVERMAP_RIVER_PROJECTION =
+ "/artifact-database/rivermap/river[@name=$name]/srid/@value";
+
+ public static final String XPATH_RIVERMAP_SHAPEFILE_DIR =
+ "/artifact-database/rivermap/shapefile-path/@value";
+
+ public static final String XPATH_RIVERMAP_VELOCITY_LOGFILE =
+ "/artifact-database/rivermap/velocity/logfile/@path";
+
+ public static final String XPATH_RIVERMAP_MAPSERVER_URL =
+ "/artifact-database/rivermap/mapserver/server/@path";
+
+ public static final String XPATH_RIVERMAP_MAPFILE_PATH =
+ "/artifact-database/rivermap/mapserver/mapfile/@path";
+
+ public static final String XPATH_RIVERMAP_MAPFILE_TEMPLATE =
+ "/artifact-database/rivermap/mapserver/map-template/@path";
+
+ public static final String XPATH_RIVERMAP_MAPSERVER_TEMPLATE_PATH =
+ "/artifact-database/rivermap/mapserver/templates/@path";
+
+ public static final Pattern DB_URL_PATTERN =
+ Pattern.compile("(.*)\\/\\/(.*):([0-9]+)\\/([a-zA-Z]+)");
+
+ public static final Pattern DB_PSQL_URL_PATTERN =
+ Pattern.compile("(.*)\\/\\/(.*):([0-9]+)\\/([a-zA-Z0-9]+)");
+
+ private static Logger logger = Logger.getLogger(RiverMapfileGenerator.class);
+
+ /**
+ * Generate river axis mapfile.
+ */
+ @Override
+ public void generate() {
+ logger.debug("generate()");
+
+ List<River> rivers = RiverFactory.getRivers();
+ List<String> riverFiles = new ArrayList<String>();
+
+ for (River river : rivers) {
+ // We expect that every river has only one RiverAxis.
+ // This is not correct but currently the case here, see
+ // RiverAxis.java.
+ List<RiverAxis> riverAxis = null;
+ try {
+ riverAxis = RiverAxis.getRiverAxis(river.getName());
+ }
+ catch (HibernateException he) {
+ logger.error("No valid river axis found for " + river.getName());
+ continue;
+ }
+
+ if (riverAxis == null) {
+ logger.warn("River " + river.getName() + " has no river axis!");
+ continue;
+ }
+ if (riverAxis.get(0).getGeom() == null) {
+ logger.warn("River " + river.getName() +
+ " has no riveraxis geometry!");
+ continue;
+ }
+ MultiLineString geom = riverAxis.get(0).getGeom();
+ Envelope extent = geom.getEnvelopeInternal();
+
+ createRiverAxisLayer(
+ river.getName(),
+ river.getId(),
+ Integer.toString(geom.getSRID()),
+ extent.getMinX() + " " +
+ extent.getMinY() + " " +
+ extent.getMaxX() + " " +
+ extent.getMaxY());
+
+ riverFiles.add("river-" + river.getName() + ".map");
+ }
+ writeMapfile(riverFiles);
+ }
+
+ protected void createRiverAxisLayer(
+ String riverName,
+ int riverID,
+ String srid,
+ String extent
+ ) {
+ LayerInfo layerInfo = new LayerInfo();
+ layerInfo.setName(riverName);
+ layerInfo.setConnection(MapUtils.getConnection());
+ layerInfo.setConnectionType(MapUtils.getConnectionType());
+ layerInfo.setSrid(srid);
+ layerInfo.setExtent(extent);
+ layerInfo.setType("line");
+ // FIXME: Use templates for that
+ if (FLYSUtils.isUsingOracle()) {
+ layerInfo.setData("geom FROM river_axes USING SRID " + srid);
+ } else {
+ layerInfo.setData("geom FROM river_axes");
+ }
+ layerInfo.setFilter("river_id = " + riverID);
+ layerInfo.setTitle(riverName + " RiverAxis");
+
+ File layerFile = new File("river-" + riverName + ".map");
+ Template template = getTemplateByName(RIVERAXIS_LAYER_TEMPLATE);
+ if (template == null) {
+ logger.warn("Template riveraxis-layer.vm not found.");
+ return;
+ }
+
+ try {
+ writeLayer(layerInfo, layerFile, template);
+ }
+ catch (FileNotFoundException e) {
+ logger.warn(e.getLocalizedMessage(), e);
+ }
+ }
+
+ @Override
+ protected String getVelocityLogfile() {
+ return FLYSUtils.getXPathString(XPATH_RIVERMAP_VELOCITY_LOGFILE);
+ }
+
+ @Override
+ protected String getMapserverTemplatePath() {
+ return FLYSUtils.getXPathString(XPATH_RIVERMAP_MAPSERVER_TEMPLATE_PATH);
+ }
+
+ @Override
+ public String getMapserverUrl() {
+ return FLYSUtils.getXPathString(XPATH_RIVERMAP_MAPSERVER_URL);
+ }
+
+ @Override
+ protected String getMapfilePath() {
+ return FLYSUtils.getXPathString(XPATH_RIVERMAP_MAPFILE_PATH);
+ }
+
+ @Override
+ protected String getMapfileTemplate() {
+ return FLYSUtils.getXPathString(XPATH_RIVERMAP_MAPFILE_TEMPLATE);
+ }
+}
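The widened database URL patterns (here and in `MapUtils`) now accept `_` and `-` in the database name, which the old `[a-zA-Z0-9]+` group rejected. A standalone sketch of how the groups decompose a JDBC URL (class name and helper method hypothetical):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class DbUrlPatternSketch {

    // Same shape as the widened DB_PSQL_URL_PATTERN above.
    public static final Pattern DB_PSQL_URL_PATTERN =
        Pattern.compile("(.*)\\/\\/(.*):([0-9]+)\\/([a-zA-Z0-9_-]+)");

    /** Returns {prefix, host, port, dbname}, or null if the URL does not match. */
    public static String[] parse(String url) {
        Matcher m = DB_PSQL_URL_PATTERN.matcher(url);
        if (!m.matches()) {
            return null;
        }
        return new String[] {
            m.group(1), m.group(2), m.group(3), m.group(4)
        };
    }

    public static void main(String[] args) {
        String[] parts = parse("jdbc:postgresql://localhost:5432/flys_db");
        System.out.println(parts[3]); // flys_db
    }
}
```

A database named `flys_db` illustrates the fix: the underscore made the old pattern fail to match at all, so no connection string could be built.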
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/java/de/intevation/flys/wsplgen/FacetCreator.java
--- a/flys-artifacts/src/main/java/de/intevation/flys/wsplgen/FacetCreator.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/java/de/intevation/flys/wsplgen/FacetCreator.java Fri Mar 22 11:25:54 2013 +0100
@@ -159,8 +159,10 @@
hash,
getUrl());
- barriers.addLayer(
- MapfileGenerator.MS_PREFIX_WSPLGEN + artifact.identifier());
+ barriers.addLayer(MapfileGenerator.MS_LAYER_PREFIX +
+ MapfileGenerator.MS_PREFIX_WSPLGEN + "lines" + artifact.identifier());
+ barriers.addLayer(MapfileGenerator.MS_LAYER_PREFIX +
+ MapfileGenerator.MS_PREFIX_WSPLGEN + "poly" + artifact.identifier());
barriers.setSrid(getSrid());
barriers.setExtent(getBounds());
@@ -168,13 +170,17 @@
}
- public void createUserShapeFacet() {
+ public void createShapeFacet(
+ String desc,
+ String layer,
+ String type,
+ int ndx) {
WMSLayerFacet shape = new WMSLayerFacet(
1,
- FLOODMAP_USERSHAPE,
+ type,
Resources.getMsg(
cc.getMeta(),
- I18N_USERSHAPE,
+ desc,
I18N_USERSHAPE_DEFAULT),
ComputeType.ADVANCE,
stateId,
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/resources/messages.properties
--- a/flys-artifacts/src/main/resources/messages.properties Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/resources/messages.properties Fri Mar 22 11:25:54 2013 +0100
@@ -15,8 +15,11 @@
state.winfo.uesk.floodplain = Lateral Boundary
state.winfo.uesk.differences = Differenzen between waterlevel and terrain
state.winfo.uesk.scenario = Flood Plain / Scenario
+state.winfo.uesk.dc-hws = HWS
+state.winfo.uesk.user-rgd = User defined shapefiles
+state.winfo.uesk.barriers = Digitized HWS
state.winfo.waterlevel_pair_select = Chosen differences
-state.winfo.historicalq.reference_gauge = Selection of Reference Gauge
+state.winfo.historicalq.reference_gauge = Selection of Gauge
state.winfo.historicalq.timerange = Selection of Evaluation time
state.winfo.historicalq.mode = Selecion of analyses
state.winfo.reference.curve.input.start = Chosen Reference
@@ -71,8 +74,8 @@
historical.mode.w = Waterlevel Analyse
historical.mode.q = Discharge Analyse
-historical_discharge.wq.curve_range = Discharge Curve {0,date,short} - {1,date,short}
-historical_discharge.wq.curve_since = Discharge Curve {0,date,short}
+historical_discharge.wq.curve_range = Discharge Curve {0,date,medium} - {1,date,medium}
+historical_discharge.wq.curve_since = Discharge Curve {0,date,medium}
calc.surface.curve = Water Level/Surface Curve
calc.flood.map = Flood Plain
@@ -151,7 +154,7 @@
chart.cross_section.title = Cross Section for river {0}
chart.cross_section.subtitle = {0}-km: {1,number,#.###}
chart.cross_section.xaxis.label = Distance [m]
-chart.cross_section.yaxis.label = W [NN + m]
+chart.cross_section.yaxis.label = W [{0}]
chart.discharge.curve.title = Discharge Curve
chart.discharge.curve.xaxis.label = Q [m\u00b3/s]
@@ -301,7 +304,7 @@
export.waterlevel.csv.header.km = River-Km
-export.waterlevel.csv.header.w = W [NN + m]
+export.waterlevel.csv.header.w = W [{0}]
export.waterlevel.csv.header.q = Q [m\u00b3/s]
export.waterlevel.csv.header.q.desc = Description
export.waterlevel.csv.header.location = Location
@@ -329,8 +332,8 @@
export.historical.discharge.csv.header.timerange = Timerange
export.historical.discharge.csv.header.waterlevel = Waterlevel [cm]
export.historical.discharge.csv.header.discharge = Discharge [m\u00b3/s]
-export.historical.discharge.csv.header.diff = Difference [m\u00b3/s]
-export.historical.discharge.csv.header.gaugename = Gaugename
+export.historical.discharge.csv.header.diff = \u0394Q to reference [m\u00b3/s]
+export.historical.discharge.csv.header.gaugename = Gauge
export.reference_curve.csv.header.km = km
export.reference_curve.csv.header.w.cm = W (cm at Gauge)
export.reference_curve.csv.header.w.m = W (m + NHN)
@@ -517,7 +520,7 @@
fix.analysis=Analysis event
fix.deviation=Standard deviation
fix.reference.deviation=Reference deviation
-fix.vollmer.wq.curve=W/Q
+fix.vollmer.wq.curve=Adjusted function
fix.vollmer.wq.outliers=Outliers
fix.vollmer.wq.events=Events
qsectors=Discharge Sectors
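The `{0,date,short}` to `{0,date,medium}` changes above swap the ambiguous two-digit-year style for a spelled-out one. A small sketch of how these resource entries are rendered via `java.text.MessageFormat` (class and method names hypothetical):

```java
import java.text.MessageFormat;
import java.util.Date;
import java.util.Locale;

public class DischargeTitleFormat {

    // Formats a chart title the way the changed resource entry does;
    // 'medium' renders e.g. "Mar 22, 2013" instead of short "3/22/13".
    public static String formatRange(Date from, Date to) {
        MessageFormat mf = new MessageFormat(
            "Discharge Curve {0,date,medium} - {1,date,medium}",
            Locale.ENGLISH);
        return mf.format(new Object[] { from, to });
    }

    public static void main(String[] args) {
        System.out.println(formatRange(new Date(0L), new Date(0L)));
    }
}
```

The same pattern syntax applies to the German bundles below; only the locale passed to `MessageFormat` changes the rendering.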
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/resources/messages_de.properties
--- a/flys-artifacts/src/main/resources/messages_de.properties Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/resources/messages_de.properties Fri Mar 22 11:25:54 2013 +0100
@@ -15,8 +15,11 @@
state.winfo.uesk.floodplain = Laterale Begrenzung
state.winfo.uesk.differences = Differenzen zwischen Wasserspiegellage und Gel\u00e4nde
state.winfo.uesk.scenario = \u00dcberschwemmungsfl\u00e4che / Szenario
+state.winfo.uesk.dc-hws = Hochwasserschutzanlagen
+state.winfo.uesk.user-rgd = Benutzerdefinierte Shapefiles
+state.winfo.uesk.barriers = Benutzerdefinierte Hochwasserschutzanlagen
state.winfo.waterlevel_pair_select = Ausgew\u00e4hlte Differenzen
-state.winfo.historicalq.reference_gauge = Wahl des Bezugspegels
+state.winfo.historicalq.reference_gauge = Wahl des Pegels
state.winfo.historicalq.timerange = Wahl des Auswertezeitraums
state.winfo.historicalq.mode = Wahl der Analyseart
state.winfo.reference.curve.input.start = Bezugsort
@@ -72,8 +75,8 @@
historical.mode.w = Wasserstandsanalyse
historical.mode.q = Abflussanalyse
-historical_discharge.wq.curve_range = Abflusskurve {0,date,short} - {1,date,short}
-historical_discharge.wq.curve_since = Abflusskurve ab {0,date,short}
+historical_discharge.wq.curve_range = Abflusskurve {0,date,medium} - {1,date,medium}
+historical_discharge.wq.curve_since = Abflusskurve ab {0,date,medium}
calc.surface.curve = Wasserstand/Wasserspiegellage
calc.flood.map = \u00dcberschwemmungsfl\u00e4che
@@ -144,7 +147,7 @@
chart.cross_section.title = Querprofildiagramm f\u00fcr Gew\u00e4sser {0}
chart.cross_section.subtitle = {0}-km: {1,number,#.###}
chart.cross_section.xaxis.label = Abstand [m]
-chart.cross_section.yaxis.label = W [NN + m]
+chart.cross_section.yaxis.label = W [{0}]
chart.longitudinal.section.title = W-L\u00e4ngsschnitt
chart.longitudinal.section.subtitle = Bereich: {0}-km {1,number,#.###} - {2,number,#.###}
@@ -291,7 +294,7 @@
chart.beddifference.yaxis.label.heights = Absolute H\u00f6he [m]
export.waterlevel.csv.header.km = Fluss-Km
-export.waterlevel.csv.header.w = W [NN + m]
+export.waterlevel.csv.header.w = W [{0}]
export.waterlevel.csv.header.q = Q [m\u00b3/s]
export.waterlevel.csv.header.q.desc = Bezeichnung
export.waterlevel.csv.header.location = Lage
@@ -319,8 +322,8 @@
export.historical.discharge.csv.header.timerange = Zeitraum
export.historical.discharge.csv.header.waterlevel = Wasserstand [cm]
export.historical.discharge.csv.header.discharge = Abfluss [m\u00b3/s]
-export.historical.discharge.csv.header.diff = Abflussdifferenz zur Bezugskurve [m\u00b3/s]
-export.historical.discharge.csv.header.gaugename = Pegelname
+export.historical.discharge.csv.header.diff = \u0394Q zur Bezugskurve [m\u00b3/s]
+export.historical.discharge.csv.header.gaugename = Pegel
export.reference_curve.csv.header.km = km
export.reference_curve.csv.header.w.cm = W (cm am Pegel)
export.reference_curve.csv.header.w.m = W (m + NHN)
@@ -471,6 +474,9 @@
help.state.winfo.uesk.floodplain=https://flys-intern.intevation.de/Flys-3.0/OnlineHilfe/WINFO#help.state.winfo.uesk.floodplain
help.state.winfo.uesk.differences=https://flys-intern.intevation.de/Flys-3.0/OnlineHilfe/WINFO#help.state.winfo.uesk.differences
help.state.winfo.uesk.scenario=https://flys-intern.intevation.de/Flys-3.0/OnlineHilfe/WINFO#help.state.winfo.uesk.scenario
+help.state.winfo.uesk.dc-hws=https://flys-intern.intevation.de/Flys-3.0/OnlineHilfe/WINFO#help.state.winfo.uesk.scenario
+help.state.winfo.uesk.user-rgd=https://flys-intern.intevation.de/Flys-3.0/OnlineHilfe/WINFO#help.state.winfo.uesk.scenario
+help.state.winfo.uesk.barriers=https://flys-intern.intevation.de/Flys-3.0/OnlineHilfe/WINFO#help.state.winfo.uesk.scenario
help.state.winfo.historicalq.reference_gauge=https://flys-intern.intevation.de/Flys-3.0/OnlineHilfe/WINFO#help.state.winfo.historicalq.reference_gauge
help.state.winfo.historicalq.timerange=https://flys-intern.intevation.de/Flys-3.0/OnlineHilfe/WINFO#help.state.winfo.historicalq.timerange
help.state.winfo.historicalq.mode=https://flys-intern.intevation.de/Flys-3.0/OnlineHilfe/WINFO#help.state.winfo.historicalq.mode
@@ -509,7 +515,7 @@
fix.analysis=Analyseereignis
fix.deviation=Standardabweichung
fix.reference.deviation=Abweichung im Bezugszeitraum
-fix.vollmer.wq.curve=W/Q
+fix.vollmer.wq.curve=Angepasste Funktion
fix.vollmer.wq.outliers=Ausrei\u00dfer
fix.vollmer.wq.events=Ereignisse
qsectors=Abfluss-Sektoren
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/resources/messages_de_DE.properties
--- a/flys-artifacts/src/main/resources/messages_de_DE.properties Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/resources/messages_de_DE.properties Fri Mar 22 11:25:54 2013 +0100
@@ -15,8 +15,11 @@
state.winfo.uesk.floodplain = Laterale Begrenzung
state.winfo.uesk.differences = Differenzen zwischen Wasserspiegellage und Gel\u00e4nde
state.winfo.uesk.scenario = \u00dcberschwemmungsfl\u00e4che / Szenario
+state.winfo.uesk.dc-hws = Hochwasserschutzanlagen
+state.winfo.uesk.user-rgd = Benutzerdefinierte Shapefiles
+state.winfo.uesk.barriers = Benutzerdefinierte Hochwasserschutzanlagen
state.winfo.waterlevel_pair_select = Ausgew\u00e4hlte Differenzen
-state.winfo.historicalq.reference_gauge = Wahl des Bezugspegels
+state.winfo.historicalq.reference_gauge = Wahl des Pegels
state.winfo.historicalq.timerange = Wahl des Auswertezeitraums
state.winfo.historicalq.mode = Wahl der Analyseart
state.winfo.reference.curve.input.start = Bezugsort
@@ -73,8 +76,8 @@
historical.mode.w = Wasserstandsanalyse
historical.mode.q = Abflussanalyse
-historical_discharge.wq.curve_range = Abflusskurve {0,date,short} - {1,date,short}
-historical_discharge.wq.curve_since = Abflusskurve ab {0,date,short}
+historical_discharge.wq.curve_range = Abflusskurve {0,date,medium} - {1,date,medium}
+historical_discharge.wq.curve_since = Abflusskurve ab {0,date,medium}
calc.surface.curve = Wasserstand/Wasserspiegellage
calc.flood.map = \u00dcberschwemmungsfl\u00e4che
@@ -145,7 +148,7 @@
chart.cross_section.title = Querprofildiagramm f\u00fcr Gew\u00e4sser {0}
chart.cross_section.subtitle = {0}-km: {1,number,#.###}
chart.cross_section.xaxis.label = Abstand [m]
-chart.cross_section.yaxis.label = W [NN + m]
+chart.cross_section.yaxis.label = W [{0}]
chart.longitudinal.section.title = W-L\u00e4ngsschnitt
chart.longitudinal.section.subtitle = Bereich: {0}-km {1,number,#.###} - {2,number,#.###}
@@ -300,7 +303,7 @@
chart.beddifference.yaxis.label.heights = Absolute H\u00f6he [m]
export.waterlevel.csv.header.km = Fluss-Km
-export.waterlevel.csv.header.w = W [NN + m]
+export.waterlevel.csv.header.w = W [{0}]
export.waterlevel.csv.header.q = Q [m\u00b3/s]
export.waterlevel.csv.header.q.desc = Bezeichnung
export.waterlevel.csv.header.location = Lage
@@ -328,8 +331,8 @@
export.historical.discharge.csv.header.timerange = Zeitraum
export.historical.discharge.csv.header.waterlevel = Wasserstand [cm]
export.historical.discharge.csv.header.discharge = Abfluss [m\u00b3/s]
-export.historical.discharge.csv.header.diff = Abflussdifferenz zur Bezugskurve [m\u00b3/s]
-export.historical.discharge.csv.header.gaugename = Pegelname
+export.historical.discharge.csv.header.diff = \u0394Q zur Bezugskurve [m\u00b3/s]
+export.historical.discharge.csv.header.gaugename = Pegel
export.reference_curve.csv.header.km = km
export.reference_curve.csv.header.w.cm = W (cm am Pegel)
export.reference_curve.csv.header.w.m = W (m + NHN)
@@ -517,7 +520,7 @@
fix.analysis=Analyseereignis
fix.deviation=Standardabweichung
fix.reference.deviation=Abweichung im Bezugszeitraum
-fix.vollmer.wq.curve=W/Q
+fix.vollmer.wq.curve=Angepasste Funktion
fix.vollmer.wq.outliers=Ausrei\u00dfer
fix.vollmer.wq.events=Ereignisse
qsectors=Abfluss-Sektoren
diff -r cfc5540a4eec -r 61bf64b102bc flys-artifacts/src/main/resources/messages_en.properties
--- a/flys-artifacts/src/main/resources/messages_en.properties Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-artifacts/src/main/resources/messages_en.properties Fri Mar 22 11:25:54 2013 +0100
@@ -15,8 +15,11 @@
state.winfo.uesk.floodplain = Lateral Boundary
state.winfo.uesk.differences = Differences between waterlevel and terrain
state.winfo.uesk.scenario = Flood Plain / Scenario
+state.winfo.uesk.dc-hws = HWS
+state.winfo.uesk.user-rgd = User-defined shapefiles
+state.winfo.uesk.barriers = Digitized HWS
state.winfo.waterlevel_pair_select = Chosen Differences
-state.winfo.historicalq.reference_gauge = Selection of Reference Gauge
+state.winfo.historicalq.reference_gauge = Selection of Gauge
state.winfo.historicalq.timerange = Selection of Evaluation time
state.winfo.historicalq.mode = Selecion of analyses
state.winfo.reference.curve.input.start = Chosen Reference
@@ -71,8 +74,8 @@
historical.mode.w = Waterlevel Analyse
historical.mode.q = Discharge Analyse
-historical_discharge.wq.curve_range = Discharge Curve {0,date,short} - {1,date,short}
-historical_discharge.wq.curve_since = Discharge Curve {0,date,short}
+historical_discharge.wq.curve_range = Discharge Curve {0,date,medium} - {1,date,medium}
+historical_discharge.wq.curve_since = Discharge Curve {0,date,medium}
calc.surface.curve = Water Level/Surface Curve
calc.flood.map = Flood Plain
@@ -143,7 +146,7 @@
chart.cross_section.title = Cross Section for river {0}
chart.cross_section.subtitle = {0}-km: {1,number,#.###}
chart.cross_section.xaxis.label = Distance [m]
-chart.cross_section.yaxis.label = W [NN + m]
+chart.cross_section.yaxis.label = W [{0}]
chart.longitudinal.section.title = W-Longitudinal Section
chart.longitudinal.section.subtitle = Range: {0}-km {1,number,#.###} - {2,number,#.###}
@@ -303,7 +306,7 @@
chart.beddifference.yaxis.label.heights = Absolute Height [m]
export.waterlevel.csv.header.km = River-Km
-export.waterlevel.csv.header.w = W [NN + m]
+export.waterlevel.csv.header.w = W [{0}]
export.waterlevel.csv.header.q = Q [m\u00b3/s]
export.waterlevel.csv.header.q.desc = Description
export.waterlevel.csv.header.location = Location
@@ -331,8 +334,8 @@
export.historical.discharge.csv.header.timerange = Timerange
export.historical.discharge.csv.header.waterlevel = Waterlevel [cm]
export.historical.discharge.csv.header.discharge = Discharge [m\u00b3/s]
-export.historical.discharge.csv.header.diff = Difference [m\u00b3/s]
-export.historical.discharge.csv.header.gaugename = Gaugename
+export.historical.discharge.csv.header.diff = \u0394Q to reference [m\u00b3/s]
+export.historical.discharge.csv.header.gaugename = Gauge
export.reference_curve.csv.header.km = km
export.reference_curve.csv.header.w.cm = W (cm at Gauge)
export.reference_curve.csv.header.w.m = W (m + NHN)
@@ -520,7 +523,7 @@
fix.analysis=Analysis event
fix.deviation=Standard deviation
fix.reference.deviation=Reference deviation
-fix.vollmer.wq.curve=W/Q
+fix.vollmer.wq.curve=Adjusted function
fix.vollmer.wq.outliers=Outliers
fix.vollmer.wq.events=Events
qsectors=Discharge Sectors
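The `{0,date,medium}` and `{1,number,#.###}` placeholders changed in the hunks above are java.text.MessageFormat patterns. A minimal sketch of how they render (the class name `PatternDemo` and the river name "Rhein" are illustrative, not from the patch):

```java
import java.text.MessageFormat;
import java.util.Date;
import java.util.Locale;

public class PatternDemo {
    // Render a MessageFormat pattern with an explicit locale, as happens
    // when the strings from messages_de.properties etc. are filled in.
    public static String render(String pattern, Locale loc, Object... args) {
        return new MessageFormat(pattern, loc).format(args);
    }

    public static void main(String[] args) {
        // {1,number,#.###} limits the km value to three fraction digits;
        // the German locale uses a decimal comma.
        System.out.println(render("{0}-km: {1,number,#.###}",
            Locale.GERMANY, "Rhein", 12.3456));   // Rhein-km: 12,346

        // date,medium prints a four-digit year (e.g. 01.01.1970), which is
        // why the patch switches from date,short (two-digit year, 01.01.70).
        System.out.println(render("Abflusskurve ab {0,date,medium}",
            Locale.GERMANY, new Date(0L)));
    }
}
```

The same mechanism explains the `W [NN + m]` → `W [{0}]` change: the unit is now passed in as an argument instead of being hard-coded.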
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/README
--- a/flys-backend/README Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/README Fri Mar 22 11:25:54 2013 +0100
@@ -9,9 +9,17 @@
$ createuser --no-createrole --no-superuser --pwprompt --no-createdb flys
$ createdb --encoding=UTF-8 --owner flys flystest1
+
+Build an importer package:
+mvn -f pom.xml clean compile assembly:single
+Alternatively with oracle:
+mvn -f pom-oracle.xml clean compile assembly:single
+
Standalone DateGuesser testing example:
mvn -e -Dexec.mainClass=de.intevation.flys.utils.DateGuesser -Dexec.args="110803" exec:java <<EOF
110803
EOF
+Some importers allow standalone usage:
+mvn -e -Dexec.mainClass=de.intevation.flys.importer.parsers.AtFileParser -Dexec.args=/home/felix/.bashrc exec:java
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/hws_schema.diff
--- a/flys-backend/contrib/hws_schema.diff Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,128 +0,0 @@
-diff -r 0bb0525bca52 flys-backend/doc/schema/postgresql-drop-spatial.sql
---- a/flys-backend/doc/schema/postgresql-drop-spatial.sql Fri Jan 25 15:38:34 2013 +0100
-+++ b/flys-backend/doc/schema/postgresql-drop-spatial.sql Fri Jan 25 15:42:05 2013 +0100
-@@ -27,8 +27,14 @@
- DROP TABLE catchment;
- DROP SEQUENCE CATCHMENT_ID_SEQ;
-
--DROP TABLE hws;
--DROP SEQUENCE HWS_ID_SEQ;
-+--DROP TABLE hws;
-+--DROP SEQUENCE HWS_ID_SEQ;
-+
-+DROP TABLE hws_points;
-+DROP SEQUENCE HWS_POINTS_ID_SEQ;
-+
-+DROP TABLE hws_lines;
-+DROP SEQUENCE HWS_LINES_ID_SEQ;
-
- DROP TABLE floodmaps;
- DROP SEQUENCE FLOODMAPS_ID_SEQ;
-@@ -42,4 +48,7 @@
- DROP TABLE gauge_location;
- DROP SEQUENCE GAUGE_LOCATION_ID_SEQ;
-
-+DROP TABLE fed_states;
-+DROP TABLE hws_kinds;
-+
- COMMIT;
-diff -r 0bb0525bca52 flys-backend/doc/schema/postgresql-spatial.sql
---- a/flys-backend/doc/schema/postgresql-spatial.sql Fri Jan 25 15:38:34 2013 +0100
-+++ b/flys-backend/doc/schema/postgresql-spatial.sql Fri Jan 25 15:42:05 2013 +0100
-@@ -132,20 +132,87 @@
- SELECT AddGeometryColumn('catchment','geom',31467,'POLYGON',2);
- ALTER TABLE catchment ALTER COLUMN id SET DEFAULT NEXTVAL('CATCHMENT_ID_SEQ');
-
-+-- Static lookup tables for Hochwasserschutzanlagen
-+CREATE TABLE hws_kinds (
-+ id int PRIMARY KEY NOT NULL,
-+ kind VARCHAR(64) NOT NULL
-+);
-+INSERT INTO hws_kinds (id, kind) VALUES (1, 'Durchlass');
-+INSERT INTO hws_kinds (id, kind) VALUES (2, 'Damm');
-+INSERT INTO hws_kinds (id, kind) VALUES (3, 'Graben');
-
----Hydrologie/HW-Schutzanlagen/hws.shp
--CREATE SEQUENCE HWS_ID_SEQ;
--CREATE TABLE hws (
-+CREATE TABLE fed_states (
- id int PRIMARY KEY NOT NULL,
-+ name VARCHAR(23) NOT NULL
-+);
-+INSERT INTO fed_states (id, name) VALUES (1, 'Bayern');
-+INSERT INTO fed_states (id, name) VALUES (2, 'Hessen');
-+INSERT INTO fed_states (id, name) VALUES (3, 'Niedersachsen');
-+INSERT INTO fed_states (id, name) VALUES (4, 'Nordrhein-Westfalen');
-+INSERT INTO fed_states (id, name) VALUES (5, 'Rheinland-Pfalz');
-+INSERT INTO fed_states (id, name) VALUES (6, 'Saarland');
-+INSERT INTO fed_states (id, name) VALUES (7, 'Schleswig-Holstein');
-+INSERT INTO fed_states (id, name) VALUES (8, 'Brandenburg');
-+INSERT INTO fed_states (id, name) VALUES (9, 'Mecklenburg-Vorpommern');
-+INSERT INTO fed_states (id, name) VALUES (10, 'Thüringen');
-+INSERT INTO fed_states (id, name) VALUES (11, 'Baden-Württemberg');
-+INSERT INTO fed_states (id, name) VALUES (12, 'Sachsen-Anhalt');
-+INSERT INTO fed_states (id, name) VALUES (13, 'Sachsen');
-+INSERT INTO fed_states (id, name) VALUES (14, 'Berlin');
-+INSERT INTO fed_states (id, name) VALUES (15, 'Bremen');
-+INSERT INTO fed_states (id, name) VALUES (16, 'Hamburg');
-+
-+--Hydrologie/HW-Schutzanlagen/*Linien.shp
-+CREATE SEQUENCE HWS_LINES_ID_SEQ;
-+CREATE TABLE hws_lines (
-+ id int PRIMARY KEY NOT NULL,
-+ ogr_fid int,
-+ kind_id int REFERENCES hws_kinds(id) DEFAULT 2,
-+ fed_state_id int REFERENCES fed_states(id),
- river_id int REFERENCES rivers(id),
-- hws_facility VARCHAR(256),
-- type VARCHAR(256),
-- name VARCHAR(64),
-- path VARCHAR(256)
-+ name VARCHAR(256),
-+ path VARCHAR(256),
-+ offical INT DEFAULT 0,
-+ agency VARCHAR(256),
-+ range VARCHAR(256),
-+ shore_side INT DEFAULT 0,
-+ source VARCHAR(256),
-+ status_date TIMESTAMP,
-+ description VARCHAR(256)
- );
--SELECT AddGeometryColumn('hws','geom',31467,'LINESTRING',2);
--ALTER TABLE hws ALTER COLUMN id SET DEFAULT NEXTVAL('HWS_ID_SEQ');
-+SELECT AddGeometryColumn('hws_lines', 'geom', 31467, 'LINESTRING', 2);
-+SELECT AddGeometryColumn('hws_lines', 'geom_target', 31467, 'LINESTRING', 2); -- ?
-+SELECT AddGeometryColumn('hws_lines', 'geom_rated_level', 31467, 'LINESTRING', 2); -- ?
-+-- TODO: dike_km_from dike_km_to, are they geometries?
-
-+ALTER TABLE hws_lines ALTER COLUMN id SET DEFAULT NEXTVAL('HWS_LINES_ID_SEQ');
-+
-+--Hydrologie/HW-Schutzanlagen/*Punkte.shp
-+CREATE SEQUENCE HWS_POINTS_ID_SEQ;
-+CREATE TABLE hws_points (
-+ id int PRIMARY KEY NOT NULL,
-+ ogr_fid int,
-+ kind_id int REFERENCES hws_kinds(id) DEFAULT 2,
-+ fed_state_id int REFERENCES fed_states(id),
-+ river_id int REFERENCES rivers(id),
-+ name VARCHAR,
-+ path VARCHAR,
-+ offical INT DEFAULT 0,
-+ agency VARCHAR,
-+ range VARCHAR,
-+ shore_side INT DEFAULT 0,
-+ source VARCHAR,
-+ status_date VARCHAR,
-+ description VARCHAR,
-+ freeboard FLOAT8,
-+ dike_km FLOAT8,
-+ z FLOAT8,
-+ z_target FLOAT8,
-+ rated_level FLOAT8
-+);
-+SELECT AddGeometryColumn('hws_points', 'geom', 31467, 'POINT', 2);
-+
-+ALTER TABLE hws_points ALTER COLUMN id SET DEFAULT NEXTVAL('HWS_POINTS_ID_SEQ');
-
- --
- --Hydrologie/UeSG
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/import-gew.py
--- a/flys-backend/contrib/import-gew.py Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,223 +0,0 @@
-#!/usr/bin/env python
-# -*- coding: utf-8 -*-
-
-import sys
-import os
-import codecs
-import re
-
-HAUPTWERT = re.compile(r"\s*([^\s]+)\s+([^\s+]+)\s+([QWDT-])")
-WHITESPACE = re.compile(r"\s+")
-
-class KM(object):
-
- def __init__(self, filename):
- self.filename = filename
- self.load_values()
-
- def load_values(self):
- with codecs.open(self.filename, "r", "latin-1") as f:
- for line in f:
- line = line.strip()
- if not line or line.startswith("*"):
- parts = [s.strip() for s in line.split(";")]
- # TODO: Use code from import-kms.py
-
-class AbflussTafel(object):
-
- def __init__(self, filename):
- self.filename = filename
- self.name = ""
- self.values = []
- self.load_values()
-
- def load_values(self):
- with codecs.open(self.filename, "r", "latin-1") as f:
- first = True
- for line in f:
- line = line.strip()
- if not line: continue
- if line.startswith("#! name="):
- self.name = line[8:]
- continue
- if line.startswith("#") or line.startswith("*"):
- continue
- line = line.replace(",", ".")
- splits = WHITESPACE.split(line)
-
- if len(splits) < 2 or len(splits) > 11:
- continue
-
- w = float(splits[0])
-
- shift = 0
-
- if len(splits) != 11 and first:
- shift = 11 - len(splits)
-
- for idx, q in enumerate(splits[1:]):
- i_w = w + shift + idx
- i_q = float(q)
- w_q = (i_w/100.0, i_q/100.0)
- self.values.append(w_q)
-
- first = False
-
-
-class Hauptwert(object):
- def __init__(self, name, value, kind):
- self.name = name
- self.extra = value
- self.kind = kind
-
-class Pegel(object):
- def __init__(self, name, start, stop, sta, at, html):
- self.name = name
- self.start = start
- self.stop = stop
- self.sta = sta
- self.at = at
- self.html = html
- self.aeo = 0.0
- self.nullpunkt = 0.0
- self.km = 0.0
- self.hauptwerte = []
- self.load_hauptwerte()
- self.at_data = AbflussTafel(self.at)
-
- def load_hauptwerte(self):
- with codecs.open(self.sta, "r", "latin-1") as f:
- for line_no, line in enumerate(f):
- line = line.rstrip()
- if line_no == 0:
- first = False
- name = line[16:37].strip()
- line = [s.replace(",", ".") for s in line[37:].split()]
- self.aeo = float(line[0])
- self.nullpunkt = float(line[1])
- print >> sys.stderr, "pegel name: '%s'" % name
- print >> sys.stderr, "pegel aeo: '%f'" % self.aeo
- print >> sys.stderr, "pegel nullpunkt: '%f'" % self.nullpunkt
- elif line_no == 1:
- self.km = float(line[29:36].strip().replace(",", "."))
- print >> sys.stderr, "km: '%f'" % self.km
- else:
- if not line: continue
- line = line.replace(",", ".")
- m = HAUPTWERT.match(line)
- if not m: continue
- self.hauptwerte.append(Hauptwert(
- m.group(1), float(m.group(2)), m.group(3)))
-
-class Gewaesser(object):
-
- def __init__(self, name=None, b_b=None, wst=None):
- self.name = name
- self.b_b = b_b
- self.wst = wst
- self.pegel = []
-
- def load_pegel(self):
- dir_name = os.path.dirname(self.wst)
- pegel_glt = find_file(dir_name, "PEGEL.GLT")
- if not pegel_glt:
- print >> sys.stderr, "Missing PEGEL.GLT for %r" % self.name
- return
-
- print >> sys.stderr, "pegel_glt: %r" % pegel_glt
-
- with codecs.open(pegel_glt, "r", "latin-1") as f:
- for line in f:
- line = line.strip()
- if not line or line.startswith("#"):
- continue
- # using re to cope with quoted columns,
- # shlex has unicode problems.
- parts = [p for p in re.split("( |\\\".*?\\\"|'.*?')", line)
- if p.strip()]
- if len(parts) < 7:
- print >> sys.stderr, "too less colums (need 7): %r" % line
- continue
-
- print >> sys.stderr, "%r" % parts
- self.pegel.append(Pegel(
- parts[0],
- min(float(parts[2]), float(parts[3])),
- max(float(parts[2]), float(parts[3])),
- norm_path(parts[4], dir_name),
- norm_path(parts[5], dir_name),
- parts[6]))
-
-
- def __repr__(self):
- return u"Gewaesser(name=%r, b_b=%r, wst=%r)" % (
- self.name, self.b_b, self.wst)
-
-def norm_path(path, ref):
- if not os.path.isabs(path):
- path = os.path.normpath(os.path.join(ref, path))
- return path
-
-def find_file(path, what):
- what = what.lower()
- for filename in os.listdir(path):
- p = os.path.join(path, filename)
- if os.path.isfile(p) and filename.lower() == what:
- return p
- return None
-
-
-def read_gew(filename):
-
- gewaesser = []
-
- current = Gewaesser()
-
- filename = os.path.abspath(filename)
- dirname = os.path.dirname(filename)
-
- with codecs.open(filename, "r", "latin-1") as f:
- for line in f:
- line = line.strip()
- if not line or line.startswith("*"):
- continue
-
- if line.startswith(u"Gewässer:"):
- if current.name:
- gewaesser.append(current)
- current = Gewaesser()
- current.name = line[len(u"Gewässer:"):].strip()
- elif line.startswith(u"B+B-Info:"):
- current.b_b = norm_path(line[len(u"B+B-Info:"):].strip(),
- dirname)
- elif line.startswith(u"WSTDatei:"):
- current.wst = norm_path(line[len(u"WSTDatei:"):].strip(),
- dirname)
-
- if current.name:
- gewaesser.append(current)
-
- return gewaesser
-
-def main():
-
- if len(sys.argv) < 2:
- print >> sys.stderr, "missing gew file"
- sys.exit(1)
-
- gew_filename = sys.argv[1]
-
- if not os.path.isfile(gew_filename):
- print >> sys.stderr, "'%s' is not a file" % gew_filename
- sys.exit(1)
-
- gewaesser = read_gew(gew_filename)
-
- for gew in gewaesser:
- gew.load_pegel()
-
-
-
-if __name__ == '__main__':
- main()
-# vim: set fileencoding=utf-8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/import-kms.py
--- a/flys-backend/contrib/import-kms.py Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,213 +0,0 @@
-#!/usr/bin/env python
-
-import sys
-import logging
-import re
-import os
-
-import sqlite3 as db
-import locale
-import codecs
-
-from optparse import OptionParser
-
-log = logging.getLogger(__name__)
-log.setLevel(logging.WARNING)
-log.addHandler(logging.StreamHandler(sys.stderr))
-
-RANGE = re.compile("([^#]*)#(.*)")
-DEFAULT_DATABASE = "flys.db"
-
-SQL_NEXT_ID = "SELECT coalesce(max(id), -1) + 1 FROM %s"
-SQL_SELECT_ID = "SELECT id FROM %s WHERE %s = ?"
-SQL_INSERT_ID = "INSERT INTO %s (id, %s) VALUES (?, ?)"
-
-SQL_SELECT_RANGE_ID = """
-SELECT id FROM ranges WHERE river_id = ? AND a = ? AND b = ?
-"""
-SQL_INSERT_RANGE_ID = """
-INSERT INTO ranges (id, river_id, a, b) VALUES (?, ?, ?, ?)
-"""
-SQL_SELECT_ANNOTATION_ID = """
-SELECT id FROM annotations
-WHERE range_id = ? AND attribute_id = ? AND position_id = ?
-"""
-SQL_INSERT_ANNOTATION_ID = """
-INSERT INTO annotations (id, range_id, attribute_id, position_id)
-VALUES (?, ?, ?, ?)
-"""
-
-def encode(s):
- try:
- return unicode(s, "latin-1")
- except UnicodeDecodeError:
- return unicode.encode(s, locale.getpreferredencoding())
-
-class hashabledict(dict):
- def __key(self):
- return tuple((k, self[k]) for k in sorted(self))
- def __hash__(self):
- return hash(self.__key())
- def __eq__(self, other):
- return self.__key() == other.__key()
-
-def cache(f):
- def func(*args, **kw):
- key = (args, hashabledict(kw))
- try:
- return f.__cache__[key]
- except KeyError:
- value = f(*args, **kw)
- f.__cache__[key] = value
- return value
- f.__cache__ = {}
- return func
-
-NEXT_IDS = {}
-def next_id(cur, relation):
- idx = NEXT_IDS.get(relation)
- if idx is None:
- cur.execute(SQL_NEXT_ID % relation)
- idx = cur.fetchone()[0]
- NEXT_IDS[relation] = idx + 1
- return idx
-
-def get_id(cur, relation, attribute, value):
- select_stmt = SQL_SELECT_ID % (relation, attribute)
- #log.debug(select_stmt)
- cur.execute(select_stmt, (value,))
- row = cur.fetchone()
- if row: return row[0]
- idx = next_id(cur, relation)
- insert_stmnt = SQL_INSERT_ID % (relation, attribute)
- #log.debug(insert_stmnt)
- cur.execute(insert_stmnt, (idx, value))
- cur.connection.commit()
- log.debug("insert %s '%s' id: '%d'" % (relation, value, idx))
- return idx
-
-#@cache
-def get_river_id(cur, name):
- return get_id(cur, "rivers", "name", name)
-
-#@cache
-def get_attribute_id(cur, value):
- return get_id(cur, "attributes", "value", value)
-
-#@cache
-def get_position_id(cur, value):
- return get_id(cur, "positions", "value", value)
-
-#@cache
-def get_range_id(cur, river_id, a, b):
- cur.execute(SQL_SELECT_RANGE_ID, (river_id, a, b))
- row = cur.fetchone()
- if row: return row[0]
- idx = next_id(cur, "ranges")
- cur.execute(SQL_INSERT_RANGE_ID, (idx, river_id, a, b))
- cur.connection.commit()
- return idx
-
-#@cache
-def get_annotation_id(cur, range_id, attribute_id, position_id):
- cur.execute(SQL_SELECT_ANNOTATION_ID, (
- range_id, attribute_id, position_id))
- row = cur.fetchone()
- if row: return row[0]
- idx = next_id(cur, "annotations")
- cur.execute(SQL_INSERT_ANNOTATION_ID, (
- idx, range_id, attribute_id, position_id))
- cur.connection.commit()
- return idx
-
-def files(root, accept=lambda x: True):
- if os.path.isfile(root):
- if accept(root): yield root
- elif os.path.isdir(root):
- stack = [ root ]
- while stack:
- cur = stack.pop()
- for f in os.listdir(cur):
- p = os.path.join(cur, f)
- if os.path.isdir(p):
- stack.append(p)
- elif os.path.isfile(p) and accept(p):
- yield p
-
-def feed_km(cur, river_id, km_file):
-
- log.info("processing: %s" % km_file)
-
- for line in codecs.open(km_file, "r", "latin-1"):
- line = line.strip()
- if not line or line.startswith('*'):
- continue
- parts = [x.strip() for x in line.split(';')]
- if len(parts) < 3:
- log.error("cannot process: '%s'" % line)
- continue
- m = RANGE.match(parts[2])
- try:
- if m:
- x = [float(x.replace(",", ".")) for x in m.groups()]
- a, b = min(x), max(x)
- if a == b: b = None
- else:
- a, b = float(parts[2].replace(",", ".")), None
- except ValueError:
- log.error("cannot process: '%s'" % line)
- continue
-
- attribute = parts[0]
- position = parts[1]
- attribute_id = get_attribute_id(cur, attribute) if attribute else None
- position_id = get_position_id(cur, position) if position else None
-
- range_id = get_range_id(cur, river_id, a, b)
-
- get_annotation_id(cur, range_id, attribute_id, position_id)
-
-def main():
-
- usage = "usage: %prog [options] river km-file ..."
- parser = OptionParser(usage=usage)
- parser.add_option(
- "-v", "--verbose", action="store_true",
- dest="verbose",
- help="verbose output")
- parser.add_option(
- "-r", "--recursive", action="store_true",
- dest="recursive", default=False,
- help="recursive")
- parser.add_option(
- "-d", "--database", action="store",
- dest="database",
- help="database to connect with",
- default=DEFAULT_DATABASE)
-
- options, args = parser.parse_args()
-
- if options.verbose:
- log.setLevel(logging.INFO)
-
- if len(args) < 1:
- log.error("missing river argument")
- sys.exit(1)
-
- river = unicode(args[0], locale.getpreferredencoding())
-
- with db.connect(options.database) as con:
- cur = con.cursor()
- river_id = get_river_id(cur, river)
-
- for arg in args[1:]:
- if options.recursive:
- for km_file in files(
- arg, lambda x: x.lower().endswith(".km")):
- feed_km(cur, river_id, km_file)
- else:
- feed_km(cur, river_id, arg)
-
-
-if __name__ == '__main__':
- main()
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/import_river.sh
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-backend/contrib/import_river.sh Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,352 @@
+#!/bin/bash
+# Import script for rivers
+#
+# Authors:
+# Andre Heinecke <aheinecke at intevation.de>
+#
+# Copyright:
+# Copyright (C) 2013 Intevation GmbH
+#
+# This program is free software; you can redistribute it and/or
+# modify it under the terms of the GNU General Public License
+# as published by the Free Software Foundation; either version 2
+# of the License, or (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program; if not, write to the Free Software
+# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
+
+set -e
+
+# Default settings
+DEFAULT_HOST=localhost
+DEFAULT_PORT=1521
+DEFAULT_USER=flys_dami
+DEFAULT_PASS=flys_dami
+DEFAULT_LOG=$PWD/logs
+DEFAULT_BACKEND_NAME="XE"
+JAR="hydr_morph/importer.jar"
+IMPORTER_DRY_RUN=false
+IMPORTER_MAINVALUE_TYPES=QWTD
+IMPORTER_ANNOTATION_TYPES="conf/annotation-types.xml"
+
+
+MIN_MEMORY="8024m"
+
+if [ -z "$OPTIONAL_LIBS" ]; then
+ OPTIONAL_LIBS="$(dirname $0)/opt"
+fi
+
+if [ -d "$OPTIONAL_LIBS" ]; then
+ export PATH="$OPTIONAL_LIBS/bin:$PATH"
+ export LD_LIBRARY_PATH="$OPTIONAL_LIBS/lib:$LD_LIBRARY_PATH"
+ export LD_LIBRARY_PATH="$OPTIONAL_LIBS/lib64:$LD_LIBRARY_PATH"
+ export PYTHONPATH="$OPTIONAL_LIBS/lib/python2.6/site-packages:$PYTHONPATH"
+ export PYTHONPATH="$OPTIONAL_LIBS/lib64/python2.6/site-packages:$PYTHONPATH"
+ export GDAL_DATA="$OPTIONAL_LIBS/share/gdal"
+fi
+
+usage(){
+ cat << EOF
+
+usage: $0 [options] gew_file
+
+Import a river described by the gew_file
+
+OPTIONS:
+ -?, --help Show this message
+ -u, --username=<username> Database username. Default: $DEFAULT_USER
+ -w, --password=<password> Database password. Default: $DEFAULT_PASS
+ -h, --host=<host> Connect to database on host <host>.
+ Default: $DEFAULT_HOST
+ -p, --port=<number> Use port number <number>. Default: $DEFAULT_PORT
+ -d, --db-name=<database_name> Name of the database / backend. Default: $DEFAULT_BACKEND_NAME
+ -l, --log-dir=<directory> Directory in which to create the log files.
+                                Default: $DEFAULT_LOG
+ --postgres Database is PostgreSQL
+ --skip-hydro Skip import of hydrological data
+ --skip-morpho Skip import of morphological data
+ --skip-geo Skip import of geographic data
+ --skip-wst Skip import of wst data
+EOF
+exit 0
+}
+
+OPTS=`getopt -o ?u:w:h:p:d: \
+ -l help,username:,password:,host:,port:,db-name:,skip-hydro,skip-morpho,skip-geo,skip-wst,postgres \
+ -n $0 -- "$@"`
+if [ $? != 0 ] ; then usage; fi
+eval set -- "$OPTS"
+while true ; do
+ case "$1" in
+ "-?"|"--help")
+ usage;;
+ "--")
+ shift
+ break;;
+ "-u"|"--username")
+ DBUSER=$2
+ shift 2;;
+ "-w"|"--password")
+ DBPASS=$2
+ shift 2;;
+ "-h"|"--host")
+ DBHOST=$2
+ shift 2;;
+ "-p"|"--port")
+ DBPORT=$2
+ shift 2;;
+ "-l"|"--log-dir")
+ LOG=$2
+ shift 2;;
+ "-d"|"--db-name")
+ BACKEND_NAME=$2
+ shift 2;;
+ "--skip-hydro")
+ SKIP_HYDRO="TRUE"
+ shift;;
+ "--skip-morpho")
+ SKIP_MORPHO="TRUE"
+ shift;;
+ "--skip-wst")
+ SKIP_WST="TRUE"
+ shift;;
+ "--skip-geo")
+ SKIP_GEO="TRUE"
+ shift;;
+ "--postgres")
+ POSTGRES="TRUE"
+ shift;;
+ *)
+ echo "Unknown Option $1"
+ usage;;
+ esac
+done
+
+if [ -z $DBUSER ]; then
+ DBUSER=$DEFAULT_USER
+fi
+if [ -z $DBPASS ]; then
+ DBPASS=$DEFAULT_PASS
+fi
+if [ -z $DBPORT ]; then
+ DBPORT=$DEFAULT_PORT
+fi
+if [ -z $DBHOST ]; then
+ DBHOST=$DEFAULT_HOST
+fi
+if [ -z $BACKEND_NAME ]; then
+ BACKEND_NAME=$DEFAULT_BACKEND_NAME
+fi
+if [ -z $LOG ]; then
+ LOG=$DEFAULT_LOG
+fi
+
+if [ $# != 1 ]; then
+ usage
+fi
+
+if [ ! -r $1 ]; then
+    echo "Could not open $1; please ensure it exists and is readable"
+    exit 1
+fi
+
+GEW_FILE="$1"
+RIVER_NAME=$(grep "Gew.sser" "$1" | awk '{print $2}')
+DATE=$(date +%Y.%m.%d_%H%M)
+LOG_DIR=${LOG}/${RIVER_NAME}-$DATE
+mkdir -p ${LOG_DIR}
+
+if [ "$POSTGRES" = "TRUE" ]; then
+ JAR=$(echo "$JAR" | sed 's/importer/importer_psql/')
+ if [ ! -r "$JAR" ]; then
+ echo "Could not find Postgres importer $JAR"
+ exit 1
+ fi
+ OGR_CONNECTION="PG:dbname=$BACKEND_NAME host=$DBHOST port=$DBPORT \
+ user=$DBUSER password=$DBPASS"
+ BACKEND_DB_PREFIX="jdbc:postgresql:"
+ BACKEND_DB_DRIVER="org.postgresql.Driver"
+ BACKEND_DB_DIALECT="org.hibernate.dialect.PostgreSQLDialect"
+else
+ BACKEND_DB_PREFIX="jdbc:oracle:thin:@"
+ BACKEND_DB_DRIVER="oracle.jdbc.OracleDriver"
+ BACKEND_DB_DIALECT="org.hibernate.dialect.OracleDialect"
+fi
+
+BACKEND_URL=$BACKEND_DB_PREFIX//$DBHOST:$DBPORT/$BACKEND_NAME
+
+echo "Importing $RIVER_NAME into $BACKEND_URL."
+
+import_hydro(){
+ LOG_FILE=${LOG_DIR}/hydro.log
+ echo Importing Hydrological data.
+ echo Logging into: $LOG_FILE
+ sed 's!./import.log!'"$LOG_FILE"'!' conf/log4j.properties > $LOG_DIR/log4j.properties
+ java -jar \
+ -Xmx$MIN_MEMORY \
+ -server \
+ -Dlog4j.configuration=file://$LOG_DIR/log4j.properties \
+ -Dflys.backend.user=$DBUSER \
+ -Dflys.backend.password=$DBPASS \
+ -Dflys.backend.url=$BACKEND_URL \
+ -Dflys.backend.driver=$BACKEND_DB_DRIVER \
+ -Dflys.backend.dialect=$BACKEND_DB_DIALECT \
+ -Dflys.backend.importer.infogew.file="$GEW_FILE" \
+ -Dflys.backend.main.value.types=$IMPORTER_MAINVALUE_TYPES \
+ -Dflys.backend.importer.annotation.types=$IMPORTER_ANNOTATION_TYPES \
+ -Dflys.backend.importer.dry.run=$IMPORTER_DRY_RUN \
+ -Dflys.backend.importer.skip.annotations=false \
+ -Dflys.backend.importer.skip.bwastr=false \
+ -Dflys.backend.importer.skip.da50s=false \
+ -Dflys.backend.importer.skip.da66s=false \
+ -Dflys.backend.importer.skip.extra.wsts=false \
+ -Dflys.backend.importer.skip.fixations=false \
+ -Dflys.backend.importer.skip.flood.water=false \
+ -Dflys.backend.importer.skip.flood.protection=false \
+ -Dflys.backend.importer.skip.gauges=false \
+ -Dflys.backend.importer.skip.historical.discharge.tables=false \
+ -Dflys.backend.importer.skip.hyks=false \
+ -Dflys.backend.importer.skip.official.lines=false \
+ -Dflys.backend.importer.skip.prfs=false \
+ -Dflys.backend.importer.skip.w80s=false \
+ -Dflys.backend.importer.skip.wst=true \
+ -Dflys.backend.importer.skip.waterlevel.differences=true \
+ -Dflys.backend.importer.skip.waterlevels=true \
+ -Dflys.backend.importer.skip.sq.relation=true \
+ -Dflys.backend.importer.skip.sediment.density=true \
+ -Dflys.backend.importer.skip.sediment.yield=true \
+ -Dflys.backend.importer.skip.morphological.width=true \
+ -Dflys.backend.importer.skip.flow.velocity=true \
+ -Dflys.backend.importer.skip.bed.height.single=true \
+ -Dflys.backend.importer.skip.bed.height.epoch=true \
+ $JAR
+}
+
+import_morpho(){
+ LOG_FILE=${LOG_DIR}/morpho.log
+ echo Importing Morphological data.
+ echo Logging into: $LOG_FILE
+ sed 's!./import.log!'"$LOG_FILE"'!' conf/log4j.properties > $LOG_DIR/log4j.properties
+ java -jar \
+ -Xmx$MIN_MEMORY \
+ -server \
+ -Dlog4j.configuration=file://$LOG_DIR/log4j.properties \
+ -Dflys.backend.user=$DBUSER \
+ -Dflys.backend.password=$DBPASS \
+ -Dflys.backend.url=$BACKEND_URL \
+ -Dflys.backend.driver=$BACKEND_DB_DRIVER \
+ -Dflys.backend.dialect=$BACKEND_DB_DIALECT \
+ -Dflys.backend.importer.infogew.file="$GEW_FILE" \
+ -Dflys.backend.main.value.types=$IMPORTER_MAINVALUE_TYPES \
+ -Dflys.backend.importer.annotation.types=$IMPORTER_ANNOTATION_TYPES \
+ -Dflys.backend.importer.dry.run=$IMPORTER_DRY_RUN \
+ -Dflys.backend.importer.skip.annotations=true \
+ -Dflys.backend.importer.skip.bwastr=true \
+ -Dflys.backend.importer.skip.da50s=true \
+ -Dflys.backend.importer.skip.da66s=true \
+ -Dflys.backend.importer.skip.extra.wsts=true \
+ -Dflys.backend.importer.skip.fixations=true \
+ -Dflys.backend.importer.skip.flood.water=true \
+ -Dflys.backend.importer.skip.flood.protection=true \
+ -Dflys.backend.importer.skip.gauges=true \
+ -Dflys.backend.importer.skip.historical.discharge.tables=true \
+ -Dflys.backend.importer.skip.hyks=true \
+ -Dflys.backend.importer.skip.official.lines=true \
+ -Dflys.backend.importer.skip.prfs=true \
+ -Dflys.backend.importer.skip.w80s=true \
+ -Dflys.backend.importer.skip.wst=true \
+ -Dflys.backend.importer.skip.waterlevel.differences=false \
+ -Dflys.backend.importer.skip.waterlevels=false \
+ -Dflys.backend.importer.skip.sq.relation=false \
+ -Dflys.backend.importer.skip.sediment.density=false \
+ -Dflys.backend.importer.skip.sediment.yield=false \
+ -Dflys.backend.importer.skip.morphological.width=false \
+ -Dflys.backend.importer.skip.flow.velocity=false \
+ -Dflys.backend.importer.skip.bed.height.single=false \
+ -Dflys.backend.importer.skip.bed.height.epoch=false \
+ $JAR
+}
+
+import_wst(){
+ LOG_FILE=${LOG_DIR}/wst.log
+ echo Importing WST data.
+ echo Logging into: $LOG_FILE
+ sed 's!./import.log!'"$LOG_FILE"'!' conf/log4j.properties > $LOG_DIR/log4j.properties
+ java -jar \
+ -Xmx$MIN_MEMORY \
+ -server \
+ -Dlog4j.configuration=file://$LOG_DIR/log4j.properties \
+ -Dflys.backend.user=$DBUSER \
+ -Dflys.backend.password=$DBPASS \
+ -Dflys.backend.url=$BACKEND_URL \
+ -Dflys.backend.driver=$BACKEND_DB_DRIVER \
+ -Dflys.backend.dialect=$BACKEND_DB_DIALECT \
+ -Dflys.backend.importer.infogew.file="$GEW_FILE" \
+ -Dflys.backend.main.value.types=$IMPORTER_MAINVALUE_TYPES \
+ -Dflys.backend.importer.annotation.types=$IMPORTER_ANNOTATION_TYPES \
+ -Dflys.backend.importer.dry.run=$IMPORTER_DRY_RUN \
+ -Dflys.backend.importer.skip.annotations=true \
+ -Dflys.backend.importer.skip.bwastr=true \
+ -Dflys.backend.importer.skip.da50s=true \
+ -Dflys.backend.importer.skip.da66s=true \
+ -Dflys.backend.importer.skip.extra.wsts=true \
+ -Dflys.backend.importer.skip.fixations=true \
+ -Dflys.backend.importer.skip.flood.water=true \
+ -Dflys.backend.importer.skip.flood.protection=true \
+ -Dflys.backend.importer.skip.gauges=true \
+ -Dflys.backend.importer.skip.historical.discharge.tables=true \
+ -Dflys.backend.importer.skip.hyks=true \
+ -Dflys.backend.importer.skip.official.lines=true \
+ -Dflys.backend.importer.skip.prfs=true \
+ -Dflys.backend.importer.skip.w80s=true \
+ -Dflys.backend.importer.skip.wst=false \
+ -Dflys.backend.importer.skip.waterlevel.differences=true \
+ -Dflys.backend.importer.skip.waterlevels=true \
+ -Dflys.backend.importer.skip.sq.relation=true \
+ -Dflys.backend.importer.skip.sediment.density=true \
+ -Dflys.backend.importer.skip.sediment.yield=true \
+ -Dflys.backend.importer.skip.morphological.width=true \
+ -Dflys.backend.importer.skip.flow.velocity=true \
+ -Dflys.backend.importer.skip.bed.height.single=true \
+ -Dflys.backend.importer.skip.bed.height.epoch=true \
+ $JAR
+}
+
+import_geo(){
+ LOG_FILE=${LOG_DIR}/geo.log
+ echo Importing Geographic data.
+ echo Logging into: $LOG_FILE
+
+ RIVER_PATH=$(grep "WSTDatei:" "$GEW_FILE" | awk '{print $2}')
+ RIVER_PATH=$(dirname "$RIVER_PATH")/../..
+ RIVER_PATH=$(readlink -f "$RIVER_PATH")
+
+ exec python $(dirname $0)/geodaesie/shpimporter.py \
+ --directory $RIVER_PATH \
+ --river_name $RIVER_NAME \
+ --ogr_connection "$OGR_CONNECTION" \
+ --host $DBHOST \
+ --user $DBUSER \
+ --password $DBPASS \
+ --verbose 1 \
+ > "$LOG_FILE" 2>&1
+}
+
+
+if [ "$SKIP_HYDRO" != "TRUE" ]; then
+import_hydro
+fi
+if [ "$SKIP_WST" != "TRUE" ]; then
+import_wst
+fi
+if [ "$SKIP_MORPHO" != "TRUE" ]; then
+import_morpho
+fi
+if [ "$SKIP_GEO" != "TRUE" ]; then
+import_geo
+fi
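The wrapper script above selects which import group runs by flipping the whole block of `-Dflys.backend.importer.skip.*=true/false` Java properties per function. A minimal sketch of that pattern (flag names abbreviated for illustration; `skip_properties` is a hypothetical helper, not part of the patch):

```python
# Sketch: derive the -D skip options the wrapper passes to the importer,
# given the set of import groups that should actually run.
HYDRO_FLAGS = ["gauges", "prfs", "hyks"]                # illustrative subset
MORPHO_FLAGS = ["sediment.density", "flow.velocity"]    # illustrative subset

def skip_properties(enabled, all_flags):
    """Return java -D options: skip=false for enabled flags, true otherwise."""
    return ["-Dflys.backend.importer.skip.%s=%s"
            % (flag, "false" if flag in enabled else "true")
            for flag in all_flags]

# e.g. run only the hydro group, skipping everything morphological:
opts = skip_properties(set(HYDRO_FLAGS), HYDRO_FLAGS + MORPHO_FLAGS)
```

This mirrors how `import_hydro`, `import_morpho` and `import_wst` differ only in which skip flags are set to `false`.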
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/run_geo.sh
--- a/flys-backend/contrib/run_geo.sh Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/contrib/run_geo.sh Fri Mar 22 11:25:54 2013 +0100
@@ -2,8 +2,7 @@
# Required
RIVER_PATH="/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Saar"
-RIVER_ID=1
-TARGET_SRS=31467
+RIVER_NAME="Saar"
# Set this to your target database for Oracle
HOST=localhost
@@ -17,7 +16,6 @@
SKIP_AXIS=0
SKIP_KMS=0
SKIP_CROSSSECTIONS=0
-SKIP_LINES=0
SKIP_FIXPOINTS=0
SKIP_BUILDINGS=0
SKIP_FLOODPLAINS=0
@@ -27,15 +25,27 @@
SKIP_GAUGE_LOCATION=0
SKIP_CATCHMENTS=0
SKIP_UESG=0
+SKIP_DGM=0
+SKIP_JETTIES=0
+# There should be no need to change anything below this line
DIR=`dirname $0`
DIR=`readlink -f "$DIR"`
+OPTIONAL_LIBS="${DIR}"/opt
+if [ -d "$OPTIONAL_LIBS" ]; then
+ export PATH="$OPTIONAL_LIBS/bin:$PATH"
+ export LD_LIBRARY_PATH="$OPTIONAL_LIBS/lib:$LD_LIBRARY_PATH"
+ export LD_LIBRARY_PATH="$OPTIONAL_LIBS/lib64:$LD_LIBRARY_PATH"
+ export PYTHONPATH="$OPTIONAL_LIBS/lib/python2.6/site-packages:$PYTHONPATH"
+ export PYTHONPATH="$OPTIONAL_LIBS/lib64/python2.6/site-packages:$PYTHONPATH"
+ export GDAL_DATA="$OPTIONAL_LIBS/share/gdal"
+fi
+
exec python $DIR/shpimporter/shpimporter.py \
--directory $RIVER_PATH \
- --river_id $RIVER_ID \
- --target_srs $TARGET_SRS \
+ --river_name $RIVER_NAME \
--ogr_connection "$OGR_CONNECTION" \
--host $HOST \
--user $USER \
@@ -44,13 +54,13 @@
--skip_axis $SKIP_AXIS \
--skip_kms $SKIP_KMS \
--skip_crosssections $SKIP_CROSSSECTIONS \
- --skip_lines $SKIP_LINES \
--skip_fixpoints $SKIP_FIXPOINTS \
--skip_buildings $SKIP_BUILDINGS \
--skip_floodplains $SKIP_FLOODPLAINS \
--skip_hydr_boundaries $SKIP_HYDR_BOUNDARIES \
--skip_gauge_locations $SKIP_GAUGE_LOCATION \
- --skip_catchments $SKIP_CATCHMENTS \
--skip_uesgs $SKIP_UESG \
--skip_hws_lines $SKIP_HWS_LINES \
- --skip_hws_points $SKIP_HWS_POINTS
+ --skip_hws_points $SKIP_HWS_POINTS \
+ --skip_dgm $SKIP_DGM \
+ --skip_jetties $SKIP_JETTIES
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/run_hydr_morph.sh
--- a/flys-backend/contrib/run_hydr_morph.sh Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/contrib/run_hydr_morph.sh Fri Mar 22 11:25:54 2013 +0100
@@ -8,6 +8,7 @@
BACKEND_PORT="1521"
BACKEND_NAME="XE"
LOG4J_CONFIG="conf/log4j.properties"
+JAR="hydr_morph/importer.jar"
#####################################################################
@@ -55,25 +56,17 @@
MIN_MEMORY="1024m"
-########################## Importer Settings ########################
-APP="de.intevation.flys.importer.Importer"
-DIR=`dirname $0`
-DIR=`readlink -f "$DIR/.."`
-#####################################################################
+######################### Run Importer ##############################
+OPTIONAL_LIBS="${DIR}"/../opt
+if [ -d "$OPTIONAL_LIBS" ]; then
+ export PATH="$OPTIONAL_LIBS/bin:$PATH"
+ export LD_LIBRARY_PATH="$OPTIONAL_LIBS/lib:$LD_LIBRARY_PATH"
+ export LD_LIBRARY_PATH="$OPTIONAL_LIBS/lib64:$LD_LIBRARY_PATH"
+fi
+export LC_ALL=de_DE@euro # Workaround encoding problem
-########################## Collect required libraries ###############
-CLASSPATH=
-for l in `find "$DIR/lib" -name \*.jar -print`; do
- CLASSPATH=$CLASSPATH:$l
-done
-
-export CLASSPATH
-#####################################################################
-
-
-######################### Run Importer ##############################
-exec java \
+exec java -jar \
-Xmx$MIN_MEMORY \
-server \
-Dlog4j.configuration=file://`readlink -f $LOG4J_CONFIG` \
@@ -110,4 +103,4 @@
-Dflys.backend.url=$BACKEND_URL \
-Dflys.backend.driver=$BACKEND_DB_DRIVER \
-Dflys.backend.dialect=$BACKEND_DB_DIALECT \
- $APP
+ $JAR
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/shpimporter/axis.py
--- a/flys-backend/contrib/shpimporter/axis.py Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/contrib/shpimporter/axis.py Fri Mar 22 11:25:54 2013 +0100
@@ -1,7 +1,10 @@
-import ogr
+try:
+ from osgeo import ogr
+except ImportError:
+ import ogr
from importer import Importer
-import shpimporter
+import utils
NAME="Axis"
TABLE_NAME="river_axes"
@@ -23,16 +26,20 @@
def isGeometryValid(self, geomType):
- return geomType == 2
-
+ return geomType in [ogr.wkbLineString,
+ ogr.wkbLineString25D,
+ ogr.wkbMultiLineString25D,
+ ogr.wkbMultiLineString]
def isShapeRelevant(self, name, path):
- return name == "achse" or name.find("achse") >= 0
+ return "km.shp" not in path.lower()
def createNewFeature(self, featureDef, feat, **args):
newFeat = ogr.Feature(featureDef)
- newFeat.SetGeometry(feat.GetGeometryRef())
+ geometry = feat.GetGeometryRef()
+ geometry.SetCoordinateDimension(3)
+ newFeat.SetGeometry(geometry)
newFeat.SetField("name", args['name'])
if self.IsFieldSet(feat, "river_id"):
@@ -40,13 +47,10 @@
else:
riverId = self.river_id
- if self.IsFieldSet(feat, "kind"):
- kind = feat.GetField("kind")
+ newFeat.SetField("river_id", riverId)
+ if args.get("name", "").lower() == "achse":
+ newFeat.SetField("kind_id", 1) # 1 is Current
else:
- kind = 0
+ newFeat.SetField("kind_id", 2) # 2 is Other
- newFeat.SetField("river_id", riverId)
- newFeat.SetField("kind", kind)
-
- return newFeat
-
+ return utils.convertToMultiLine(newFeat)
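A recurring change in this patch replaces magic-number geometry checks (`geomType == 2`) with named OGR constants covering both 2D and 2.5D line types. A standalone sketch using the well-known WKB codes (the numeric values below match the `ogr.wkb*` constants; defining them locally is an assumption to keep the example free of a GDAL dependency):

```python
# The patch widens isGeometryValid from "geomType == 2" to a set of
# line-string variants. OGR encodes 2.5D types by setting a high bit.
WKB_LINESTRING = 2            # ogr.wkbLineString
WKB_MULTILINESTRING = 5       # ogr.wkbMultiLineString
WKB_25D = 0x80000000          # flag OGR adds for the *25D variants

LINE_TYPES = {WKB_LINESTRING, WKB_MULTILINESTRING,
              WKB_LINESTRING | WKB_25D, WKB_MULTILINESTRING | WKB_25D}

def is_line_geometry(geom_type):
    """Accept 2D and 2.5D (multi-)line strings, reject everything else."""
    return geom_type in LINE_TYPES
```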
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/shpimporter/boundaries.py
--- a/flys-backend/contrib/shpimporter/boundaries.py Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/contrib/shpimporter/boundaries.py Fri Mar 22 11:25:54 2013 +0100
@@ -1,10 +1,14 @@
-import ogr
+try:
+ from osgeo import ogr
+except ImportError:
+ import ogr
from importer import Importer
+import utils
TABLE_NAME="hydr_boundaries"
TABLE_NAME_POLY="hydr_boundaries_poly"
-PATH="Hydrologie/Hydr.Grenzen/Linien"
+PATH="Hydrologie/Hydr.Grenzen"
NAME="Hydr. Boundaries"
@@ -13,29 +17,77 @@
def getPath(self, base):
return "%s/%s" % (base, PATH)
-
def getTablename(self):
return TABLE_NAME
-
def getName(self):
return NAME
+ def isGeometryValid(self, geomType):
+ return geomType in [ogr.wkbLineString,
+ ogr.wkbLineString25D,
+ ogr.wkbMultiLineString25D,
+ ogr.wkbMultiLineString]
+
+ def isShapeRelevant(self, name, path):
+ shp = ogr.Open(path)
+ if self.isGeometryValid(shp.GetLayerByName(name).GetGeomType()) and \
+ self.getKind(path) > 0:
+ return True
+ else:
+ return False
+
+ def getKind(self, path):
+ if "linien/bfg" in path.lower():
+ return 1
+ elif "linien/land" in path.lower():
+ return 2
+ elif "/sonstige/" in path.lower():
+ return 3
+ else:
+ return 0
+
+ def createNewFeature(self, featureDef, feat, **args):
+ kind = self.getKind(args['path'])
+
+ newFeat = ogr.Feature(featureDef)
+ geometry = feat.GetGeometryRef()
+ geometry.SetCoordinateDimension(3)
+
+ newFeat.SetGeometry(geometry)
+ newFeat.SetField("name", args['name'])
+ newFeat.SetField("kind", kind)
+ if self.IsFieldSet(feat, "SECTIE"):
+ newFeat.SetField("sectie", feat.GetField("SECTIE"))
+
+ if self.IsFieldSet(feat, "SOBEK"):
+ newFeat.SetField("sobek", feat.GetField("SOBEK"))
+
+ if self.IsFieldSet(feat, "river_id"):
+ newFeat.SetField("river_id", feat.GetField("river_id"))
+ else:
+ newFeat.SetField("river_id", self.river_id)
+
+ return utils.convertToMultiLine(newFeat)
+
+class HydrBoundaryPoly(HydrBoundary):
+
+ def getTablename(self):
+ return TABLE_NAME_POLY
+
+ def getName(self):
+ return "%s (Polygons)" % NAME
def isGeometryValid(self, geomType):
- return geomType == 2
-
+ return geomType == ogr.wkbPolygon or geomType == ogr.wkbMultiPolygon
def isShapeRelevant(self, name, path):
- return True
-
-
- def getKind(self, path):
- if path.find("BfG") > 0:
- return 1
+ shp = ogr.Open(path)
+ if self.isGeometryValid(shp.GetLayerByName(name).GetGeomType()) and \
+ self.getKind(path) > 0:
+ return True
else:
- return 2
-
+ return False
def createNewFeature(self, featureDef, feat, **args):
kind = self.getKind(args['path'])
@@ -48,44 +100,17 @@
newFeat.SetField("name", args['name'])
newFeat.SetField("kind", kind)
- if self.IsFieldSet(feat, "river_id"):
- newFeat.SetField("river_id", feat.GetField("river_id"))
- else:
- newFeat.SetField("river_id", self.river_id)
+ if self.IsFieldSet(feat, "SECTIE"):
+ newFeat.SetField("sectie", feat.GetField("SECTIE"))
- return newFeat
+ if self.IsFieldSet(feat, "SOBEK"):
+ newFeat.SetField("sobek", feat.GetField("SOBEK"))
-
-class HydrBoundaryPoly(HydrBoundary):
-
- def getTablename(self):
- return TABLE_NAME_POLY
-
-
- def getName(self):
- return "%s (Polygons)" % NAME
-
-
- def isGeometryValid(self, geomType):
- return geomType == 3 or geomType == 6
-
-
- def createNewFeature(self, featureDef, feat, **args):
- kind = self.getKind(args['path'])
-
- newFeat = ogr.Feature(featureDef)
- geometry = feat.GetGeometryRef()
- geometry.SetCoordinateDimension(2)
-
- newFeat.SetGeometry(geometry)
- newFeat.SetField("name", args['name'])
- newFeat.SetField("kind", kind)
-
if self.IsFieldSet(feat, "river_id"):
newFeat.SetField("river_id", feat.GetField("river_id"))
else:
newFeat.SetField("river_id", self.river_id)
- return newFeat
+ return utils.convertToMultiPolygon(newFeat)
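The new `getKind` classifies hydrological boundary shapefiles purely from their directory path; a shape whose kind resolves to 0 is skipped by `isShapeRelevant`. A self-contained sketch of that classification (same substrings as the patch):

```python
# Path-based kind classification as in HydrBoundary.getKind: the directory
# a shapefile lives in decides its kind id (0 means "not relevant").
def get_kind(path):
    p = path.lower()
    if "linien/bfg" in p:
        return 1
    elif "linien/land" in p:
        return 2
    elif "/sonstige/" in p:
        return 3
    return 0
```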
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/shpimporter/buildings.py
--- a/flys-backend/contrib/shpimporter/buildings.py Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/contrib/shpimporter/buildings.py Fri Mar 22 11:25:54 2013 +0100
@@ -1,4 +1,7 @@
-import ogr
+try:
+ from osgeo import ogr
+except ImportError:
+ import ogr
from importer import Importer
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/shpimporter/catchments.py
--- a/flys-backend/contrib/shpimporter/catchments.py Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,53 +0,0 @@
-import ogr
-
-from importer import Importer
-
-TABLE_NAME="catchment"
-PATH="Hydrologie/Einzugsgebiet"
-NAME="Catchments"
-
-
-class Catchment(Importer):
-
- def getPath(self, base):
- return "%s/%s" % (base, PATH)
-
-
- def getTablename(self):
- return TABLE_NAME
-
-
- def getName(self):
- return NAME
-
-
- def isGeometryValid(self, geomType):
- return geomType == 3 or geomType == 6
-
-
- def isShapeRelevant(self, name, path):
- return True
-
-
- def createNewFeature(self, featureDef, feat, **args):
- newFeat = ogr.Feature(featureDef)
- geometry = feat.GetGeometryRef()
- geometry.SetCoordinateDimension(2)
-
- newFeat.SetGeometry(geometry)
-
- if self.IsFieldSet(feat, "river_id"):
- newFeat.SetField("river_id", feat.GetField("river_id"))
- else:
- newFeat.SetField("river_id", self.river_id)
-
- if self.IsFieldSet(feat, "Name"):
- newFeat.SetField("name", feat.GetField("name"))
- else:
- newFeat.SetField("name", args['name'])
-
- if self.IsFieldSet(feat, "AREA"):
- newFeat.SetField("area", feat.GetField("area"))
-
- return newFeat
-
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/shpimporter/crosssectiontracks.py
--- a/flys-backend/contrib/shpimporter/crosssectiontracks.py Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/contrib/shpimporter/crosssectiontracks.py Fri Mar 22 11:25:54 2013 +0100
@@ -1,4 +1,7 @@
-import ogr
+try:
+ from osgeo import ogr
+except ImportError:
+ import ogr
from importer import Importer
@@ -34,6 +37,12 @@
newFeat.SetGeometry(feat.GetGeometryRef())
newFeat.SetField("name", args['name'])
+ if args['path'].lower().endswith("/qps.shp") and \
+ not "sonstige" in args['path'].lower():
+ newFeat.SetField("kind_id", 1) # official
+ else:
+ newFeat.SetField("kind_id", 0) # misc
+
if self.IsFieldSet(feat, "river_id"):
newFeat.SetField("river_id", feat.GetField("river_id"))
else:
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/shpimporter/dgm.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-backend/contrib/shpimporter/dgm.py Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,154 @@
+# -*- coding: utf-8 -*-
+
+import codecs
+import utils
+import datetime
+
+def latin(string):
+ return unicode(string, "latin1")
+
+import logging
+logger = logging.getLogger("DGM")
+
+
+# <dbfield> : (<csvfield>, conversion function)
+DGM_MAP = {
+ "projection" : "Projektion",
+ "elevation_state" : latin("Höhenstatus"),
+ "format" : "Format",
+ "border_break" : ("Bruchkanten",
+ lambda x: x.lower() == "ja"),
+ "resolution" : (latin("Auflösung"), lambda x: x),
+# "description" :
+ "srid" : "SRID",
+ "path" : ("Pfad_Bestand", lambda x: x),
+ }
+
+SQL_INSERT_DGT = "INSERT INTO dem (river_id, name," \
+ " time_interval_id, range_id, " + ", ".join(DGM_MAP.keys()) + \
+ ") VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)"
+SQL_INSERT_DGT_ORA = "INSERT INTO dem (river_id, name," \
+ " time_interval_id, range_id, " + ", ".join(DGM_MAP.keys()) + \
+ ") VALUES (:s, :s, :s, :s, :s, :s, :s, :s, :s, :s, :s)"
+SQL_SELECT_TIME_ID = """
+SELECT id FROM time_intervals WHERE start_time = %s AND stop_time = %s
+"""
+SQL_INSERT_TIME_ID = """
+INSERT INTO time_intervals (id, start_time, stop_time) VALUES (%s, %s, %s)
+"""
+SQL_SELECT_TIME_ID_ORA = """
+SELECT id FROM time_intervals WHERE start_time = :s AND stop_time = :s
+"""
+SQL_INSERT_TIME_ID_ORA = """
+INSERT INTO time_intervals (id, start_time, stop_time) VALUES (:s, :s, :s)
+"""
+SQL_SELECT_RANGE_ID = """
+SELECT id FROM ranges WHERE river_id = %s AND a = %s AND b = %s
+"""
+SQL_INSERT_RANGE_ID = """
+INSERT INTO ranges (id, river_id, a, b) VALUES (%s, %s, %s, %s)
+"""
+SQL_SELECT_RANGE_ID_ORA = """
+SELECT id FROM ranges WHERE river_id = :s AND a = :s AND b = :s
+"""
+SQL_INSERT_RANGE_ID_ORA = """
+INSERT INTO ranges (id, river_id, a, b) VALUES (:s, :s, :s, :s)
+"""
+SQL_NEXT_ID = "select nextval('%s_ID_SEQ')"
+SQL_NEXT_ID_ORA = "select %s_ID_SEQ.nextval FROM dual"
+
+def next_id(cur, relation, oracle):
+ if oracle:
+ cur.execute(SQL_NEXT_ID_ORA % relation.upper())
+ else:
+ cur.execute(SQL_NEXT_ID % relation.upper())
+ idx = cur.fetchone()[0]
+ return idx
+
+def get_range_id(cur, river_id, a, b, oracle):
+ if oracle:
+ cur.execute(SQL_SELECT_RANGE_ID_ORA, (river_id, a, b))
+ else:
+ cur.execute(SQL_SELECT_RANGE_ID, (river_id, a, b))
+ row = cur.fetchone()
+ if row: return row[0]
+ idx = next_id(cur, "ranges", oracle)
+ if oracle:
+ cur.execute(SQL_INSERT_RANGE_ID_ORA, (idx, river_id, a, b))
+ else:
+ cur.execute(SQL_INSERT_RANGE_ID, (idx, river_id, a, b))
+ cur.connection.commit()
+ return idx
+
+def get_time_interval_id(cur, a, b, oracle):
+ if not a or not b:
+ return None
+ if oracle:
+ cur.execute(SQL_SELECT_TIME_ID_ORA, (a, b))
+ else:
+ cur.execute(SQL_SELECT_TIME_ID, (a, b))
+ row = cur.fetchone()
+ if row: return row[0]
+ idx = next_id(cur, "time_intervals", oracle)
+ if oracle:
+ cur.execute(SQL_INSERT_TIME_ID_ORA, (idx, a, b))
+ else:
+ cur.execute(SQL_INSERT_TIME_ID, (idx, a, b))
+ cur.connection.commit()
+ return idx
+
+def insertRiverDgm(dbconn, dgmfile, river_name, dry_run, oracle):
+ with codecs.open(dgmfile, "r", "latin1") as csvfile:
+ firstline = csvfile.readline()
+ names = firstline.split(";")
+ namedict = {}
+ field_nr = 0
+ for name in names:
+ namedict[name] = field_nr
+ field_nr += 1
+
+ river_id = utils.getRiverId(dbconn, river_name, oracle)
+ for line in csvfile:
+ fields = line.split(";")
+ if not fields: continue
+ if fields[namedict[latin("Gewässer")]] != \
+ unicode(utils.getUTF8(river_name),'UTF-8'):
+ continue
+ else:
+ values=[]
+ for key, val in DGM_MAP.items():
+ if isinstance(val, tuple):
+ values.append(val[1](fields[namedict[val[0]]]))
+ else:
+ values.append(unicode.encode(
+ fields[namedict[val]], "UTF-8"))
+ km_von = fields[namedict["km_von"]]
+ km_bis = fields[namedict["km_bis"]]
+ year_from = None
+ year_to = None
+ try:
+ year_from = datetime.datetime(
+ int(fields[namedict["Jahr_von"]]), 1, 1)
+ year_to = datetime.datetime(
+ int(fields[namedict["Jahr_bis"]]), 1, 1)
+ except ValueError:
+ logger.warn("Invalid numbers (or none) found in year_from and year_to")
+
+ name = "%s KM %s - %s" % (river_name, km_von, km_bis)
+ cur = dbconn.cursor()
+ range_id = get_range_id(cur, river_id, float(km_von),
+ float(km_bis), oracle)
+ time_interval_id = get_time_interval_id(cur, year_from,
+ year_to, oracle)
+
+ if oracle:
+ stmt = SQL_INSERT_DGT_ORA
+ else:
+ stmt = SQL_INSERT_DGT
+
+ cur.execute(stmt, [river_id, name, time_interval_id,
+ range_id] + values)
+
+ if not dry_run:
+ dbconn.commit()
+
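`DGM_MAP` mixes two value shapes: a plain string names the CSV column to copy verbatim, while a tuple pairs the column name with a conversion function. A minimal sketch of that convention (`convert_row` and `FIELD_MAP` are illustrative names, not from the patch):

```python
# Sketch of the "<dbfield> : (<csvfield>, conversion function)" convention
# used by DGM_MAP: tuples carry a converter, bare strings mean plain copy.
FIELD_MAP = {
    "border_break": ("Bruchkanten", lambda x: x.lower() == "ja"),
    "projection": "Projektion",
}

def convert_row(row, field_map):
    out = {}
    for db_field, spec in field_map.items():
        if isinstance(spec, tuple):
            csv_field, conv = spec
            out[db_field] = conv(row[csv_field])   # converted value
        else:
            out[db_field] = row[spec]              # verbatim copy
    return out
```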
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/shpimporter/fixpoints.py
--- a/flys-backend/contrib/shpimporter/fixpoints.py Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/contrib/shpimporter/fixpoints.py Fri Mar 22 11:25:54 2013 +0100
@@ -1,6 +1,12 @@
-import ogr, osr
+try:
+ from osgeo import ogr, osr
+except ImportError:
+ import ogr, osr
from importer import Importer
+import logging
+logger = logging.getLogger("Fixpoints")
+fixpoints_no_km_logged=False
TABLE_NAME="fixpoints"
PATH="Geodaesie/Festpunkte"
@@ -31,9 +37,11 @@
def createNewFeature(self, featureDef, feat, **args):
newFeat = ogr.Feature(featureDef)
+
geometry = feat.GetGeometryRef()
+ geometry.SetCoordinateDimension(2)
+ newFeat.SetGeometry(geometry)
- newFeat.SetGeometry(geometry)
newFeat.SetField("name", args['name'])
if self.IsFieldSet(feat, "river_id"):
@@ -46,6 +54,10 @@
elif self.IsFieldSet(feat, "ELBE_KM"):
newFeat.SetField("km", feat.GetFieldAsDouble("ELBE_KM"))
else:
+ global fixpoints_no_km_logged
+ if not fixpoints_no_km_logged:
+ logger.error("Could not find KM attribute")
+ fixpoints_no_km_logged = True
return None
if self.IsFieldSet(feat, "X"):
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/shpimporter/floodplains.py
--- a/flys-backend/contrib/shpimporter/floodplains.py Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/contrib/shpimporter/floodplains.py Fri Mar 22 11:25:54 2013 +0100
@@ -1,4 +1,7 @@
-import ogr
+try:
+ from osgeo import ogr
+except ImportError:
+ import ogr
from importer import Importer
@@ -36,6 +39,12 @@
newFeat.SetGeometry(geometry)
newFeat.SetField("name", args['name'])
+ if args['path'].lower().endswith("/talaue.shp") and \
+ not "sonstige" in args['path'].lower():
+ newFeat.SetField("kind_id", 1) # official
+ else:
+ newFeat.SetField("kind_id", 0) # misc
+
if self.IsFieldSet(feat, "river_id"):
newFeat.SetField("river_id", feat.GetField("river_id"))
else:
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/shpimporter/gauges.py
--- a/flys-backend/contrib/shpimporter/gauges.py Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/contrib/shpimporter/gauges.py Fri Mar 22 11:25:54 2013 +0100
@@ -1,4 +1,7 @@
-import ogr
+try:
+ from osgeo import ogr
+except ImportError:
+ import ogr
from importer import Importer
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/shpimporter/hws.py
--- a/flys-backend/contrib/shpimporter/hws.py Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/contrib/shpimporter/hws.py Fri Mar 22 11:25:54 2013 +0100
@@ -3,44 +3,196 @@
try:
from osgeo import ogr
-except ImportErrror:
+except ImportError:
import ogr
from importer import Importer
import utils
+import logging
+logger = logging.getLogger("HWS")
+
PATH="Hydrologie/HW-Schutzanlagen"
NAME="HWS"
# Keep in sync with hws_kinds table:
+# strings need to be lowercase
HWS_KIND = {
- "Durchlass" : 1,
- "Damm" : 2,
- "Deich" : 2,
- "Graben" : 3,
+ "durchlass" : 1,
+ "damm" : 2,
+ "deich" : 2,
+ "hochufer" : 2,
+ "graben" : 3,
+ "rohr1" : 1,
+ "rohr 1" : 1,
+ "rohr 2" : 1,
+ "hauptdeich" : 2,
+ "sommerdeich" : 2
}
# Keep in sync with fed_states table:
+# strings need to be lowercase
FED_STATES = {
- "Bayern" : 1,
- "Hessen" : 2,
- "Niedersachsen" : 3,
- "Nordrhein-Westfalen" : 4,
- "Rheinland-Pfalz" : 5,
- "Saarland" : 6,
- "Schleswig-Holstein" : 7,
- "Brandenburg" : 8,
- "Mecklenburg-Vorpommern" : 9,
- "Thüringen" : 10,
- "Baden-Württemberg" : 11,
- "Sachsen-Anhalt" : 12,
- "Sachsen" : 13,
- "Berlin" : 14,
- "Bremen" : 15,
- "Hamburg" : 16,
+ "bayern" : 1,
+ "hessen" : 2,
+ "niedersachsen" : 3,
+ "nordrhein-westfalen" : 4,
+ "nordrhein westfalen" : 4,
+ "rheinland-pfalz" : 5,
+ "rheinland pfalz" : 5,
+ "saarland" : 6,
+ "schleswig-holstein" : 7,
+ "schleswig holstein" : 7,
+ "brandenburg" : 8,
+ "mecklenburg-vorpommern" : 9,
+ "mecklenburg vorpommern" : 9,
+ "thüringen" : 10,
+ "baden-württemberg" : 11,
+ "baden württemberg" : 11,
+ "sachsen-anhalt" : 12,
+ "sachsen anhalt" : 12,
+ "sachsen" : 13,
+ "berlin" : 14,
+ "bremen" : 15,
+ "hamburg" : 16,
}
-class HWSLines(Importer):
+class HWSPoints(Importer):
+ fieldmap = {
+ "name$" : "name",
+ "quelle$" : "source",
+ "anmerkung$" : "description",
+ "stand$" : "status_date",
+ "verband$" : "agency",
+ "Deich_{0,1}KM$" : "dike_km",
+ "Bereich$" : "range",
+ "H[oeö]{0,2}he_{0,1}SOLL$" : "z_target",
+ "(WSP_){0,1}BfG_{0,1}100$" : "rated_level",
+ "H[oeö]{0,2}he_{0,1}IST$" : "z",
+ }
+
+ printedforpath=[]
+
+ def getPath(self, base):
+ return "%s/%s" % (base, PATH)
+
+ def getTablename(self):
+ return "hws_points"
+
+ def getName(self):
+ return "HWS_POINTS"
+
+ def isGeometryValid(self, geomType):
+ return geomType == ogr.wkbPoint or geomType == ogr.wkbPoint25D
+
+ def isShapeRelevant(self, name, path):
+ shp = ogr.Open(path)
+ return self.isGeometryValid(shp.GetLayerByName(name).GetGeomType())
+
+ def getFedStateIDfromPath(self, path):
+ """
+ Tries to extract a federal state (Bundesland) from the path
+ """
+ for state in sorted(FED_STATES.keys(), key = len, reverse = True):
+ if state in path.lower():
+ if not path in self.printedforpath:
+ logger.info("Extracted federal state from path: %s" % state)
+ self.printedforpath.append(path)
+ return FED_STATES[state]
+
+ def createNewFeature(self, featureDef, feat, **args):
+ newFeat = ogr.Feature(featureDef)
+ geometry = feat.GetGeometryRef()
+ geometry.SetCoordinateDimension(2)
+
+ self.copyFields(feat, newFeat, self.fieldmap)
+
+ newFeat.SetGeometry(geometry)
+
+ artname = self.searchField("art$")
+ if self.IsFieldSet(feat, artname):
+ self.handled(artname)
+ kind_id = HWS_KIND.get(feat.GetField(artname).lower())
+ if not kind_id:
+ logger.warn("Unknown Art: %s" % \
+ feat.GetField(artname))
+ else:
+ newFeat.SetField("kind_id", kind_id)
+
+ fname = self.searchField("Bundesland$")
+ if self.IsFieldSet(feat, fname):
+ self.handled(fname)
+ fed_id = FED_STATES.get(feat.GetField(fname).lower())
+
+ if not fed_id:
+ logger.warn("Unknown Bundesland: %s" % \
+ feat.GetField(fname))
+ else:
+ newFeat.SetField("fed_state_id", fed_id)
+ else:
+ # Try to get the bundesland from path
+ fed_id = self.getFedStateIDfromPath(args['path'])
+ if fed_id:
+ newFeat.SetField("fed_state_id", fed_id)
+
+ fname = self.searchField("(ufer$)|(flussseite$)")
+ if self.IsFieldSet(feat, fname):
+ self.handled(fname)
+ shoreString = feat.GetField(fname)
+ if "links" in shoreString.lower():
+ newFeat.SetField("shore_side", True)
+ elif "rechts" in shoreString.lower():
+ newFeat.SetField("shore_side", False)
+
+
+ fname = self.searchField("river_{0,1}id$")
+ if self.IsFieldSet(feat, fname):
+ self.handled(fname)
+ if feat.GetField(fname) != self.river_id:
+ logger.warn("River_id mismatch between shapefile and"
+ " importer parameter.")
+ newFeat.SetField("river_id", feat.GetField(fname))
+ else:
+ newFeat.SetField("river_id", self.river_id)
+
+ fname = self.searchField("name$")
+ if not self.IsFieldSet(feat, fname):
+ newFeat.SetField("name", args['name'])
+
+ fname = self.searchField("offiziell$")
+ if self.IsFieldSet(feat, fname):
+ self.handled(fname)
+ offiziell = feat.GetField(fname)
+ if offiziell == "1" or offiziell == 1:
+ newFeat.SetField("official", True)
+ else:
+ newFeat.SetField("official", False)
+ # Set the official value based on the file name as a fallback
+ elif args.get("name", "").lower() == "rohre_und_sperren" or \
+ args.get("name", "").lower() == "rohre-und-sperren":
+ newFeat.SetField("official", True)
+
+ if self.IsFieldSet(newFeat, "z") and \
+ self.IsFieldSet(newFeat, "rated_level"):
+ fname = self.searchField("freibord(_m){0,1}$")
+ self.handled(fname)
+ z = newFeat.GetFieldAsDouble("z")
+ rl = newFeat.GetFieldAsDouble("rated_level")
+ newFeat.SetField("freeboard", z - rl)
+
+ return newFeat
+
+class HWSLines(HWSPoints):
+
+ # TODO: GEOM_target, GEOM_rated_level, dike_km_from, dike_km_to
+ fieldmap = {
+ "name$" : "name",
+ "quelle$" : "source",
+ "anmerkung$" : "description",
+ "stand$" : "status_date",
+ "verband$" : "agency",
+ "Bereich$" : "range",
+ }
def getPath(self, base):
return "%s/%s" % (base, PATH)
@@ -52,136 +204,26 @@
return "HWS_LINES"
def isGeometryValid(self, geomType):
- return geomType == 2
+ return geomType in [ogr.wkbLineString,
+ ogr.wkbLineString25D,
+ ogr.wkbMultiLineString25D,
+ ogr.wkbMultiLineString]
def isShapeRelevant(self, name, path):
- return True
+ shp = ogr.Open(path)
+ return self.isGeometryValid(shp.GetLayerByName(name).GetGeomType())
def createNewFeature(self, featureDef, feat, **args):
- newFeat = ogr.Feature(featureDef)
+ newFeat = HWSPoints.createNewFeature(self, featureDef, feat, **args)
geometry = feat.GetGeometryRef()
- geometry.SetCoordinateDimension(2)
-
+ if geometry.GetCoordinateDimension() == 2:
+ geometry.SetCoordinateDimension(3)
+ for i in range(0, geometry.GetPointCount()):
+ x,y,z = geometry.GetPoint(i)
+ z = 9999
+ geometry.SetPoint(i, x, y, z)
newFeat.SetGeometry(geometry)
- if self.IsFieldSet(feat, "river_id"):
- newFeat.SetField("river_id", feat.GetField("river_id"))
- else:
- newFeat.SetField("river_id", self.river_id)
+ return utils.convertToMultiLine(newFeat)
- if self.IsFieldSet(feat, "TYP"):
- newFeat.SetField("type", feat.GetField("TYP"))
- if self.IsFieldSet(feat, "Bauart"):
- newFeat.SetField("hws_facility", feat.GetField("Bauart"))
-
- if self.IsFieldSet(feat, "Name"):
- newFeat.SetField("name", feat.GetField("name"))
- else:
- newFeat.SetField("name", args['name'])
-
- return newFeat
-
-class HWSPoints(Importer):
- fieldmap = {
- "Name" : "name",
- "Quelle" : "source",
- "Anmerkung" : "description",
- "Stand" : "status_date",
- "Verband" : "agency",
- "Deich_KM" : "dike_km",
- "Bereich" : "range",
- "Höhe_SOLL" : "z_target",
- "WSP_BfG100" : "rated_level",
- "Hoehe_IST" : "z",
- }
-
- def getPath(self, base):
- return "%s/%s" % (base, PATH)
-
- def getTablename(self):
- return "hws_points"
-
- def getName(self):
- return "HWS_POINTS"
-
- def isGeometryValid(self, geomType):
- return geomType == 1
-
- def isShapeRelevant(self, name, path):
- if "punkte" in os.path.basename(path).lower():
- return True
- else:
- return False
-
- def createNewFeature(self, featureDef, feat, **args):
- newFeat = ogr.Feature(featureDef)
- geometry = feat.GetGeometryRef()
- geometry.SetCoordinateDimension(2)
-
- self.copyFields(feat, newFeat, self.fieldmap)
-
- newFeat.SetGeometry(geometry)
-
- newFeat.SetFID(feat.GetFID())
-
- newFeat.SetField("ogr_fid", feat.GetFID())
-
- if self.IsFieldSet(feat, "Art"):
- self.handled("Art")
- kind_id = HWS_KIND.get(feat.GetField("Art"))
- if not kind_id:
- print ("Unbekannte Art: %s" % \
- feat.GetField("Art"))
- else:
- newFeat.SetField("kind_id", kind_id)
-
- if self.IsFieldSet(feat, "Bundesland"):
- self.handled("Bundesland")
- fed_id = FED_STATES.get(feat.GetField("Bundesland"))
-
- if not fed_id:
- print ("Unbekanntes Bundesland: %s" % \
- feat.GetField("Bundesland"))
- else:
- newFeat.SetField("fed_state_id", fed_id)
-
- if self.IsFieldSet(feat, "river_id"):
- self.handled("river_id")
- if feat.GetField("river_id") != self.river_id:
- print ("River_id mismatch between shapefile and"
- " importer parameter.")
- newFeat.SetField("river_id", feat.GetField("river_id"))
- else:
- newFeat.SetField("river_id", self.river_id)
-
- if self.IsFieldSet(feat, "Ufer"):
- self.handled("Ufer")
- shoreString = feat.GetField("Ufer")
- if "links" in shoreString.lower():
- newFeat.SetField("shore_side", True)
- elif "rechts" in shoreString.lower():
- newFeat.SetField("shore_side", False)
-
- if not self.IsFieldSet(feat, "Name"):
- self.handled("Name")
- newFeat.SetField("name", args['name'])
-
- if self.IsFieldSet(feat, "offiziell"):
- self.handled("offiziell")
- offiziell = feat.GetField("offiziell")
- if offiziell == "1" or offiziell == 1:
- newFeat.SetField("offiziell", True)
- else:
- newFeat.SetField("offiziell", False)
-
- if self.IsFieldSet(newFeat, "z") and \
- self.IsFieldSet(newFeat, "rated_level"):
- self.handled("Freibord_m")
- z = newFeat.GetFieldAsDouble("z")
- rl = newFeat.GetFieldAsDouble("rated_level")
- newFeat.SetField("freeboard", z - rl)
-
- return newFeat
-
-
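The rewritten `HWSPoints.fieldmap` keys are regular expressions (e.g. `"H[oeö]{0,2}he_{0,1}SOLL$"`) so that spelling variants of shapefile attribute names all map to one database column, resolved via `searchField`. A hedged sketch of that lookup (case-insensitive matching and the `ValueError` are assumptions for illustration; the patch's `searchField` raises its own exception on ambiguity):

```python
import re

# Regex-keyed field mapping in the style of HWSPoints.fieldmap.
FIELDMAP = {
    "H[oeö]{0,2}he_{0,1}SOLL$": "z_target",
    "name$": "name",
}

def match_field(fieldnames, pattern):
    """Return the single attribute name matching pattern, or None."""
    matches = [f for f in fieldnames if re.search(pattern, f, re.IGNORECASE)]
    if len(matches) > 1:
        raise ValueError("ambiguous field for %r: %s" % (pattern, matches))
    return matches[0] if matches else None
```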
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/shpimporter/importer.py
--- a/flys-backend/contrib/shpimporter/importer.py Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/contrib/shpimporter/importer.py Fri Mar 22 11:25:54 2013 +0100
@@ -1,21 +1,24 @@
try:
- from osgeo import ogr
-except ImportErrror:
- import ogr
-import osr
-import shpimporter
+ from osgeo import ogr, osr
+except ImportError:
+ import ogr, osr
import utils
+import re
+import logging
+
+logger = logging.getLogger("importer")
class Importer:
- def __init__(self, config, dbconn):
- self.config = config
+ def __init__(self, river_id, dbconn, dry_run):
+ self.river_id = river_id
self.dbconn = dbconn
- self.river_id = config.river_id
+ self.dry_run = dry_run
self.dest_srs = osr.SpatialReference()
- self.dest_srs.ImportFromEPSG(config.target_srs)
+ self.dest_srs.ImportFromEPSG(31467)
self.handled_fields = []
self.tracking_import = False
+ self.srcLayer = None
def getKind(self, path):
raise NotImplementedError("Importer.getKind is abstract!")
@@ -27,13 +30,48 @@
raise NotImplementedError("Importer.getTablename is abstract!")
def getName(self):
- raise NotImplementedError("Importer.getTablename is abstract!")
+ raise NotImplementedError("Importer.getName is abstract!")
+
+ def isGeometryValid(self, geomType):
+ raise NotImplementedError("Importer.isGeometryValid is abstract!")
+
+ def createNewFeature(self, featureDef, feat, **args):
+ raise NotImplementedError("Importer.createNewFeature is abstract!")
def IsFieldSet(self, feat, name):
+ if not name:
+ return False
if feat.GetFieldIndex(name) == -1:
return False # Avoids an Error in IsFieldSet
return feat.IsFieldSet(feat.GetFieldIndex(name))
+ def searchField(self, regex):
+ """
+ Searches for a field in the current src layer that matches
+ the regular expression regex.
+ Throws an exception if more than one field matches.
+ @param regex: The regex to look for
+
+ @returns: The field name as a string
+ """
+
+ if not hasattr(self.srcLayer, "fieldnames"):
+ self.srcLayer.fieldnames = []
+ for i in range(0, self.srcLayer.GetLayerDefn().GetFieldCount()):
+ self.srcLayer.fieldnames.append(
+ self.srcLayer.GetLayerDefn().GetFieldDefn(i).GetNameRef())
+
+ result = None
+ for name in self.srcLayer.fieldnames:
+ match = re.match(regex, name, re.IGNORECASE)
+ if match:
+ if result:
+ raise Exception("More than one field matches: %s" % regex)
+ else:
+ result = match.group(0)
+ return result
+
def IsDoubleFieldSet(self, feat, name):
try:
isset = feat.GetFieldAsDouble(name)
@@ -46,20 +84,23 @@
def walkOverShapes(self, shape):
(name, path) = shape
- if not self.isShapeRelevant(name, path):
- shpimporter.INFO("Skip shapefile '%s'" % path)
- return
shp = ogr.Open(shape[1])
if shp is None:
- shpimporter.ERROR("Shapefile '%s' could not be opened!" % path)
+ logger.error("Shapefile '%s' could not be opened!" % path)
return
- shpimporter.INFO("Processing shapefile '%s'" % path)
+ if not self.isShapeRelevant(name, path):
+ logger.info("Skip shapefile: '%s' of Type: %s" % (path,
+ utils.getWkbString(shp.GetLayerByName(name).GetGeomType())))
+ return
+
+
+ logger.info("Processing shapefile '%s'" % path)
srcLayer = shp.GetLayerByName(name)
if srcLayer is None:
- shpimporter.ERROR("Layer '%s' was not found!" % name)
+ logger.error("Layer '%s' was not found!" % name)
return
return self.shape2Database(srcLayer, name, path)
@@ -69,11 +110,12 @@
src_srs = geometry.GetSpatialReference()
if src_srs is None:
- shpimporter.ERROR("No source SRS given! No transformation possible!")
+ logger.error("No source SRS given! No transformation possible!")
return feat
transformer = osr.CoordinateTransformation(src_srs, self.dest_srs)
- geometry.Transform(transformer)
+ if geometry.Transform(transformer):
+ return None
return feat
@@ -90,15 +132,19 @@
"""
Checks the mapping dictionary for key value pairs to
copy from the source to the destination feature.
+ The keys can be regular expressions that are matched
+ against the source fieldnames.
The Key is the attribute of the source feature to be copied
into the target attribute named by the dict's value.
"""
self.tracking_import = True
- self.handled_fields.extend(mapping.keys())
for key, value in mapping.items():
- if src.GetFieldIndex(key) == -1:
+ realname = self.searchField(key)
+ if realname == None:
continue
+ if not realname in self.handled_fields:
+ self.handled_fields.append(realname)
# 0 OFTInteger, Simple 32bit integer
# 1 OFTIntegerList, List of 32bit integers
# 2 OFTReal, Double Precision floating point
@@ -111,31 +157,32 @@
# 9 OFTDate, Date
# 10 OFTTime, Time
# 11 OFTDateTime, Date and Time
- if src.IsFieldSet(src.GetFieldIndex(key)):
- if src.GetFieldType(key) == 2:
- target.SetField(value, src.GetFieldAsDouble(key))
+ if src.IsFieldSet(src.GetFieldIndex(realname)):
+ if src.GetFieldType(realname) == 2:
+ target.SetField(value, src.GetFieldAsDouble(realname))
else:
- target.SetField(value, src.GetField(key))
+ target.SetField(value, utils.getUTF8(src.GetField(realname)))
def shape2Database(self, srcLayer, name, path):
destLayer = self.dbconn.GetLayerByName(self.getTablename())
if srcLayer is None:
- shpimporter.ERROR("Shapefile is None!")
+ logger.error("Shapefile is None!")
return -1
if destLayer is None:
- shpimporter.ERROR("No destination layer given!")
+ logger.error("No destination layer given!")
return -1
count = srcLayer.GetFeatureCount()
- shpimporter.DEBUG("Try to add %i features to database." % count)
+ logger.debug("Try to add %i features to database." % count)
srcLayer.ResetReading()
+ self.srcLayer = srcLayer
geomType = -1
success = 0
- unsupported = 0
+ unsupported = {}
creationFailed = 0
featureDef = destLayer.GetLayerDefn()
@@ -143,7 +190,7 @@
geom = feat.GetGeometryRef()
if geom is None:
- shpimporter.DEBUG("Unkown Geometry reference for feature")
+ logger.debug("Unknown geometry reference for feature")
continue
geomType = geom.GetGeometryType()
@@ -151,25 +198,31 @@
if self.isGeometryValid(geomType):
newFeat = self.createNewFeature(featureDef,
feat,
- name=name,
+ name=utils.getUTF8(name),
path=path)
if newFeat is not None:
newFeat.SetField("path", utils.getUTF8Path(path))
newFeat = self.transform(newFeat)
- res = destLayer.CreateFeature(newFeat)
- if res is None or res > 0:
- shpimporter.ERROR("Unable to insert feature. Error: %r" % res)
+ if newFeat:
+ res = destLayer.CreateFeature(newFeat)
+ if res is None or res > 0:
+ logger.error("Unable to insert feature. Error: %r" % res)
+ else:
+ success = success + 1
else:
- success = success + 1
+ logger.error("Could not transform feature: %s " % feat.GetFID())
+ creationFailed += 1
else:
creationFailed = creationFailed + 1
else:
- unsupported = unsupported + 1
+ unsupported[utils.getWkbString(geomType)] = \
+ unsupported.get(utils.getWkbString(geomType), 0) + 1
- shpimporter.INFO("Inserted %i features" % success)
- shpimporter.INFO("Failed to create %i features" % creationFailed)
- shpimporter.INFO("Found %i unsupported features" % unsupported)
+ logger.info("Inserted %i features" % success)
+ logger.info("Failed to create %i features" % creationFailed)
+ for key, value in unsupported.items():
+ logger.info("Found %i unsupported features of type: %s" % (value, key))
if self.tracking_import:
unhandled = []
@@ -179,14 +232,14 @@
unhandled.append(act_field)
if len(unhandled):
- shpimporter.INFO("Did not import values from fields: %s " % \
+ logger.info("Did not import values from fields: %s " % \
" ".join(unhandled))
try:
- if self.config.dry_run > 0:
+ if self.dry_run:
return geomType
destLayer.CommitTransaction()
- except e:
- shpimporter.ERROR("Exception while committing transaction.")
+ except:
+ logger.error("Exception while committing transaction.")
return geomType
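The regex-based field lookup that this hunk adds to `Importer.searchField` (and uses in `copyFields`) can be isolated as a small helper. A minimal, standalone sketch of the match-once-or-fail pattern — `search_field` is a hypothetical free-function variant, written in Python 3 although the patch itself targets Python 2:

```python
import re

def search_field(fieldnames, regex):
    """Return the single field name matching regex (case-insensitive).

    Returns None when nothing matches and raises when the pattern is
    ambiguous, mirroring Importer.searchField in the patch above.
    """
    result = None
    for name in fieldnames:
        if re.match(regex, name, re.IGNORECASE):
            if result is not None:
                raise Exception("More than one field matches: %s" % regex)
            result = name
    return result
```

With a field map like `{"^station$": "km"}`, `search_field(["Station", "z"], "^station$")` resolves the shapefile's actual spelling `"Station"` so the copy can proceed regardless of case.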
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/shpimporter/jetties.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-backend/contrib/shpimporter/jetties.py Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,75 @@
+# -*- coding: utf-8 -*-
+import os
+
+try:
+ from osgeo import ogr
+except ImportError:
+ import ogr
+
+from importer import Importer
+import utils
+
+import logging
+logger = logging.getLogger("Jetties")
+
+PATH="Geodaesie/Bauwerke"
+NAME="Jetties"
+
+# strings need to be lowercase
+# buhnenkopf 0
+# buhnenfuß 1
+# buhnenwurzel 2
+JETTY_KIND = {
+ "bkl" : 0,
+ "bkr" : 0,
+ "bfl" : 1,
+ "bfr" : 1,
+ "bwl" : 2,
+ "bwr" : 2,
+ }
+
+class Jetties(Importer):
+ fieldmap = {
+ "^station$" : "km",
+ "^z$" : "z",
+ }
+
+ def getPath(self, base):
+ return "%s/%s" % (base, PATH)
+
+ def getTablename(self):
+ return "jetties"
+
+ def getName(self):
+ return "JETTIES"
+
+ def isGeometryValid(self, geomType):
+ return geomType == ogr.wkbPoint or geomType == ogr.wkbPoint25D
+
+ def isShapeRelevant(self, name, path):
+ if not path.endswith("Buhnen.shp"):
+ return False
+ shp = ogr.Open(path)
+ return self.isGeometryValid(shp.GetLayerByName(name).GetGeomType())
+
+ def createNewFeature(self, featureDef, feat, **args):
+ newFeat = ogr.Feature(featureDef)
+ geometry = feat.GetGeometryRef()
+ geometry.SetCoordinateDimension(2)
+
+ self.copyFields(feat, newFeat, self.fieldmap)
+
+ newFeat.SetGeometry(geometry)
+
+ artname = self.searchField("^type$")
+ if self.IsFieldSet(feat, artname):
+ self.handled(artname)
+ kind_id = JETTY_KIND.get(feat.GetField(artname).lower())
+ if kind_id == None:
+ logger.warn("Unknown Type: %s" % \
+ feat.GetField(artname))
+ else:
+ newFeat.SetField("kind_id", kind_id)
+
+ return newFeat
+
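The `JETTY_KIND` table above maps several type codes to the value 0, which is falsy, so the unknown-type check has to compare against `None` rather than rely on truthiness. A minimal sketch of that pattern (`lookup_jetty_kind` is a hypothetical helper, not part of the patch):

```python
JETTY_KIND = {
    "bkl": 0, "bkr": 0,   # Buhnenkopf (jetty head)
    "bfl": 1, "bfr": 1,   # Buhnenfuß (jetty foot)
    "bwl": 2, "bwr": 2,   # Buhnenwurzel (jetty root)
}

def lookup_jetty_kind(type_value):
    # dict.get returns None for unknown keys; the test must be
    # "is None" because the valid kind_id 0 would fail "if kind_id:".
    kind_id = JETTY_KIND.get(type_value.lower())
    if kind_id is None:
        return None  # caller logs a warning for unknown types
    return kind_id
```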
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/shpimporter/km.py
--- a/flys-backend/contrib/shpimporter/km.py Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/contrib/shpimporter/km.py Fri Mar 22 11:25:54 2013 +0100
@@ -1,4 +1,7 @@
-import ogr
+try:
+ from osgeo import ogr
+except ImportError:
+ import ogr
from importer import Importer
@@ -26,12 +29,16 @@
def isShapeRelevant(self, name, path):
- return name == "km"
+ return name.lower() == "km"
def createNewFeature(self, featureDef, feat, **args):
newFeat = ogr.Feature(featureDef)
- newFeat.SetGeometry(feat.GetGeometryRef())
+
+ geometry = feat.GetGeometryRef()
+ geometry.SetCoordinateDimension(2)
+ newFeat.SetGeometry(geometry)
+
newFeat.SetField("name", args['name'])
if self.IsFieldSet(feat, "river_id"):
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/shpimporter/lines.py
--- a/flys-backend/contrib/shpimporter/lines.py Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,56 +0,0 @@
-import ogr
-
-from importer import Importer
-
-TABLE_NAME="lines"
-PATH="Geodaesie/Linien"
-NAME="Lines"
-
-
-class Line(Importer):
-
- def getPath(self, base):
- return "%s/%s" % (base, PATH)
-
-
- def getTablename(self):
- return TABLE_NAME
-
-
- def getName(self):
- return NAME
-
-
- def isGeometryValid(self, geomType):
- return geomType == 2 or geomType == -2147483646
-
-
- def isShapeRelevant(self, name, path):
- return True
-
-
- def createNewFeature(self, featureDef, feat, **args):
- newFeat = ogr.Feature(featureDef)
- geometry = feat.GetGeometryRef()
- geometry.SetCoordinateDimension(2)
-
- newFeat.SetGeometry(geometry)
- newFeat.SetField("name", args['name'])
-
- if self.IsFieldSet(feat, "river_id"):
- newFeat.SetField("river_id", feat.GetField("river_id"))
- else:
- newFeat.SetField("river_id", self.river_id)
-
- if self.IsFieldSet(feat, "TYP"):
- newFeat.SetField("kind", feat.GetFieldAsDouble("TYP"))
- else:
- newFeat.SetField("kind", "DAMM")
-
- if self.IsFieldSet(feat, "Z"):
- newFeat.SetField("z", feat.GetFieldAsDouble("Z"))
- else:
- newFeat.SetField("z", 9999)
-
- return newFeat
-
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/shpimporter/run.sh
--- a/flys-backend/contrib/shpimporter/run.sh Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/contrib/shpimporter/run.sh Fri Mar 22 11:25:54 2013 +0100
@@ -20,7 +20,6 @@
SKIP_HYDR_BOUNDARIES=0
SKIP_HWS=0
SKIP_GAUGE_LOCATION=0
-SKIP_CATCHMENTS=0
SKIP_UESG=0
exec python shpimporter.py \
@@ -41,6 +40,5 @@
--skip_hydr_boundaries $SKIP_HYDR_BOUNDARIES \
--skip_hws $SKIP_HWS \
--skip_gauge_locations $SKIP_GAUGE_LOCATION \
- --skip_catchments $SKIP_CATCHMENTS \
--skip_uesgs $SKIP_UESG
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/shpimporter/shpimporter.py
--- a/flys-backend/contrib/shpimporter/shpimporter.py Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/contrib/shpimporter/shpimporter.py Fri Mar 22 11:25:54 2013 +0100
@@ -1,14 +1,16 @@
try:
from osgeo import ogr
-except ImportErrror:
+except ImportError:
import ogr
import utils, optparse
+import sys
+import os
+import logging
from uesg import UESG
from axis import Axis
from km import KM
-from lines import Line
from fixpoints import Fixpoint
from buildings import Building
from crosssectiontracks import CrosssectionTrack
@@ -16,44 +18,37 @@
from boundaries import HydrBoundary, HydrBoundaryPoly
from hws import HWSLines, HWSPoints
from gauges import GaugeLocation
-from catchments import Catchment
+from jetties import Jetties
+from dgm import insertRiverDgm
+logger = logging.getLogger("shpimporter")
-VERBOSE_DEBUG=2
-VERBOSE_INFO=1
+os.environ["NLS_LANG"] = ".AL32UTF8"
+def initialize_logging(level):
+ """Initializes the logging system"""
+ root = logging.getLogger()
+ root.setLevel(level)
+ hdlr = logging.StreamHandler()
+ fmt = logging.Formatter("%(levelname)s %(name)s: %(message)s")
+ hdlr.setFormatter(fmt)
+ root.addHandler(hdlr)
-def DEBUG(msg):
- config = getConfig()
- if config.verbose >= VERBOSE_DEBUG:
- print "DEBUG: %s" % msg
-
-def INFO(msg):
- config = getConfig()
- if config.verbose >= VERBOSE_INFO:
- print "INFO: %s" % msg
-
-def ERROR(msg):
- config = getConfig()
- print "ERROR: %s" % msg
-
-
-def getImporters(config, dbconn):
+def getImporters(river_id, dbconn, dry_run):
return [
- Axis(config, dbconn),
- KM(config, dbconn),
- CrosssectionTrack(config, dbconn),
- Line(config, dbconn),
- Fixpoint(config, dbconn),
- Building(config, dbconn),
- Floodplain(config, dbconn),
- HydrBoundary(config, dbconn),
- HydrBoundaryPoly(config, dbconn),
- HWSLines(config, dbconn),
- HWSPoints(config, dbconn),
- GaugeLocation(config, dbconn),
- Catchment(config, dbconn),
- UESG(config, dbconn)
+ Axis(river_id, dbconn, dry_run),
+ KM(river_id, dbconn, dry_run),
+ CrosssectionTrack(river_id, dbconn, dry_run),
+ Fixpoint(river_id, dbconn, dry_run),
+ Building(river_id, dbconn, dry_run),
+ Floodplain(river_id, dbconn, dry_run),
+ HydrBoundary(river_id, dbconn, dry_run),
+ HydrBoundaryPoly(river_id, dbconn, dry_run),
+ HWSLines(river_id, dbconn, dry_run),
+ HWSPoints(river_id, dbconn, dry_run),
+ GaugeLocation(river_id, dbconn, dry_run),
+ Jetties(river_id, dbconn, dry_run),
+ UESG(river_id, dbconn, dry_run)
]
@@ -64,7 +59,7 @@
parser.add_option("--host", type="string")
parser.add_option("--user", type="string")
parser.add_option("--password", type="string")
- parser.add_option("--river_id", type="int")
+ parser.add_option("--river_name", type="string")
parser.add_option("--verbose", type="int", default=1)
parser.add_option("--dry_run", type="int", default=0)
parser.add_option("--ogr_connection", type="string")
@@ -72,33 +67,37 @@
parser.add_option("--skip_hydr_boundaries", type="int")
parser.add_option("--skip_buildings", type="int")
parser.add_option("--skip_crosssections", type="int")
- parser.add_option("--skip_lines", type="int")
parser.add_option("--skip_fixpoints", type="int")
parser.add_option("--skip_floodplains", type="int")
parser.add_option("--skip_hws_lines", type="int")
parser.add_option("--skip_hws_points", type="int")
parser.add_option("--skip_gauge_locations", type="int")
- parser.add_option("--skip_catchments", type="int")
parser.add_option("--skip_kms", type="int")
parser.add_option("--skip_uesgs", type="int")
+ parser.add_option("--skip_dgm", type="int")
+ parser.add_option("--skip_jetties", type="int")
(config, args) = parser.parse_args()
+ if config.verbose > 1:
+ initialize_logging(logging.DEBUG)
+ elif config.verbose == 1:
+ initialize_logging(logging.INFO)
+ else:
+ initialize_logging(logging.WARN)
+
if config.directory == None:
- ERROR("No river directory specified!")
+ logger.error("No river directory specified!")
raise Exception("Invalid config")
if not config.ogr_connection:
if not config.host:
- ERROR("No database host specified!")
+ logger.error("No database host specified!")
raise Exception("Invalid config")
if not config.user:
- ERROR("No databaser user specified!")
+ logger.error("No database user specified!")
raise Exception("Invalid config")
if not config.password:
- ERROR("No password specified!")
+ logger.error("No password specified!")
raise Exception("Invalid config")
- if config.river_id == None:
- ERROR("No river id specified!")
- raise Exception("Invalid config")
return config
@@ -114,19 +113,18 @@
return True
elif config.skip_crosssections == 1 and isinstance(importer, CrosssectionTrack):
return True
- elif config.skip_lines == 1 and isinstance(importer, Line):
- return True
elif config.skip_fixpoints == 1 and isinstance(importer, Fixpoint):
return True
elif config.skip_floodplains == 1 and isinstance(importer, Floodplain):
return True
- elif config.skip_hws_points == 1 and isinstance(importer, HWSPoints):
+ elif config.skip_hws_lines == 1 and isinstance(importer, HWSLines):
return True
- elif config.skip_hws_lines == 1 and isinstance(importer, HWSLines):
+ elif config.skip_hws_points == 1 and isinstance(importer, HWSPoints) and \
+ not isinstance(importer, HWSLines):
return True
elif config.skip_gauge_locations == 1 and isinstance(importer, GaugeLocation):
return True
- elif config.skip_catchments == 1 and isinstance(importer, Catchment):
+ elif config.skip_jetties == 1 and isinstance(importer, Jetties):
return True
elif config.skip_kms == 1 and isinstance(importer, KM):
return True
@@ -135,7 +133,6 @@
return False
-
def main():
config=None
try:
@@ -144,48 +141,110 @@
return -1
if config == None:
- ERROR("Unable to read config from command line!")
+ logger.error("Unable to read config from command line!")
return
if config.dry_run > 0:
- INFO("You enable 'dry_run'. No database transaction will take place!")
+ logger.info("You enabled 'dry_run'. No database transaction will take place!")
if config.ogr_connection:
connstr = config.ogr_connection
else:
connstr = 'OCI:%s/%s@%s' % (config.user, config.password, config.host)
+ oracle = False # Marker if oracle is used.
+ if 'OCI:' in connstr:
+ oracle = True
+ try:
+ import cx_Oracle as dbapi
+ raw_connstr=connstr.replace("OCI:", "")
+ except ImportError:
+ logger.error("Module cx_Oracle not found in: %s\n"
+ "Necessary to connect to an Oracle database.\n"
+ "Please refer to the installation "
+ "documentation." % sys.path)
+ return -1
+
+ else: # Currently only support for oracle and postgres
+ try:
+ import psycopg2 as dbapi
+ raw_connstr=connstr.replace("PG:", "")
+ except ImportError:
+ logger.error("Module psycopg2 not found in: %s\n"
+ "Necessary to connect to a PostgreSQL database.\n"
+ "Please refer to the installation "
+ "documentation." % sys.path)
+ return -1
+
+ dbconn_raw = dbapi.connect(raw_connstr)
dbconn = ogr.Open(connstr)
if dbconn == None:
- ERROR("Could not connect to database %s" % connstr)
+ logger.error("Could not connect to database %s" % connstr)
return -1
- importers = getImporters(config, dbconn)
types = {}
- for importer in importers:
- if skip_importer(config, importer):
- INFO("Skip import of '%s'" % importer.getName())
+ directories = []
+ if not config.river_name:
+ for file in [os.path.join(config.directory, d) for d in \
+ os.listdir(config.directory)]:
+ if os.path.isdir(file):
+ directories.append(file)
+ else:
+ directories.append(config.directory)
+
+ for directory in directories:
+ if not config.river_name:
+ river_name = utils.getUTF8Path(
+ os.path.basename(os.path.normpath(directory)))
+ else:
+ river_name = config.river_name
+ river_id = utils.getRiverId(dbconn_raw, river_name, oracle)
+
+ if not river_id:
+ logger.info(u"Could not find river in database. Skipping: %s"
+ % unicode(utils.getUTF8(river_name), "UTF-8"))
continue
+ else:
+ logger.info(u"Importing River: %s" % unicode(
+ utils.getUTF8(river_name), "UTF-8"))
- INFO("Start import of '%s'" % importer.getName())
+ for importer in getImporters(river_id, dbconn, config.dry_run):
+ if skip_importer(config, importer):
+ logger.info("Skip import of '%s'" % importer.getName())
+ continue
- shapes = utils.findShapefiles(importer.getPath(config.directory))
- DEBUG("Found %i Shapefiles" % len(shapes))
+ logger.info("Start import of '%s'" % importer.getName())
- for shpTuple in shapes:
- geomType = importer.walkOverShapes(shpTuple)
- try:
- if geomType is not None:
- num = types[geomType]
- types[geomType] = num+1
- except:
- types[geomType] = 1
+ shapes = utils.findShapefiles(importer.getPath(config.directory))
+ logger.debug("Found %i Shapefiles" % len(shapes))
- for key in types:
- DEBUG("%i x geometry type %s" % (types[key], key))
+ for shpTuple in shapes:
+ geomType = importer.walkOverShapes(shpTuple)
+ try:
+ if geomType is not None:
+ num = types[geomType]
+ types[geomType] = num+1
+ except:
+ types[geomType] = 1
+ for key in types:
+ logger.debug("%i x geometry type %s" % (types[key], key))
+
+ if not config.skip_dgm:
+ dgmfilename = os.path.join(
+ config.directory, "..", "DGMs.csv")
+ if not os.access(dgmfilename, os.R_OK) or not \
+ os.path.isfile(dgmfilename):
+ logger.info("Could not find or access DGM file: %s \n"
+ "Skipping DGM import." % dgmfilename)
+ else:
+ logger.info("Inserting DGM meta information in 'dem' table.")
+ insertRiverDgm(dbconn_raw, dgmfilename, river_name,
+ config.dry_run, oracle)
+ else:
+ logger.info("Skip import of DGM.")
if __name__ == '__main__':
main()
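The new `--verbose` handling above maps an integer verbosity onto logging levels before calling `initialize_logging`. A minimal sketch of that mapping (`level_for_verbosity` is a hypothetical helper extracted for illustration):

```python
import logging

def level_for_verbosity(verbose):
    # Mirrors the --verbose handling in shpimporter.main():
    # 0 -> WARN, 1 -> INFO (the default), >1 -> DEBUG.
    if verbose > 1:
        return logging.DEBUG
    elif verbose == 1:
        return logging.INFO
    return logging.WARN
```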
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/shpimporter/uesg.py
--- a/flys-backend/contrib/shpimporter/uesg.py Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/contrib/shpimporter/uesg.py Fri Mar 22 11:25:54 2013 +0100
@@ -1,10 +1,14 @@
-import ogr
+try:
+ from osgeo import ogr
+except ImportError:
+ import ogr
from importer import Importer
-
+import os.path
+import utils
TABLE_NAME="floodmaps"
-PATH="Hydrologie/UeSG/Berechnung"
+PATH="Hydrologie/UeSG"
NAME="UESG"
@@ -23,11 +27,8 @@
def isGeometryValid(self, geomType):
- if geomType == 3 or geomType == 6:
- return True
- else:
- return False
-
+ return geomType in [ogr.wkbMultiPolygon,
+ ogr.wkbPolygon]
def getKind(self, path):
kind = 0
@@ -39,7 +40,7 @@
else:
kind = kind + 20
- if path.find("Land") > 0:
+ if path.find("Bundesl") > 0:
kind = kind + 2
else:
kind = kind + 1
@@ -51,7 +52,6 @@
def createNewFeature(self, featureDef, feat, **args):
kind = self.getKind(args['path'])
-
newFeat = ogr.Feature(featureDef)
newFeat.SetGeometry(feat.GetGeometryRef())
@@ -80,6 +80,11 @@
else:
perimeter = 0
+ if kind >= 200:
+ newFeat.SetField("source",
+ os.path.basename(os.path.dirname(args['path'])))
+
+
groupId = 2
newFeat.SetField("river_id", riverId)
@@ -90,5 +95,5 @@
newFeat.SetField("kind", kind)
newFeat.SetField("name", args['name'])
- return newFeat
+ return utils.convertToMultiPolygon(newFeat)
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/contrib/shpimporter/utils.py
--- a/flys-backend/contrib/shpimporter/utils.py Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/contrib/shpimporter/utils.py Fri Mar 22 11:25:54 2013 +0100
@@ -1,8 +1,17 @@
import os
import sys
-from shpimporter import DEBUG, INFO, ERROR
+import logging
+
+try:
+ from osgeo import ogr
+except ImportError:
+ import ogr
+
+logger = logging.getLogger("utils")
SHP='.shp'
+SQL_SELECT_RIVER_ID="SELECT id FROM rivers WHERE name = %s"
+SQL_SELECT_RIVER_ID_ORA="SELECT id FROM rivers WHERE name = :s"
def findShapefiles(path):
shapes = []
@@ -11,7 +20,7 @@
if len(files) == 0:
continue
- DEBUG("Processing directory '%s' with %i files " % (root, len(files)))
+ logger.debug("Processing directory '%s' with %i files " % (root, len(files)))
for f in files:
idx = f.find(SHP)
@@ -20,6 +29,36 @@
return shapes
+def getRiverId(dbconn, name, oracle):
+ """
+ Returns the id of the river "name"
+ Dbconn must be a python database connection api compliant object
+ """
+ cur = dbconn.cursor()
+ if oracle:
+ # This is stupid and should not be necessary. But I don't
+ # know how to make it work both ways. aheinecke - 02/2013
+ stmt = SQL_SELECT_RIVER_ID_ORA
+ else:
+ stmt = SQL_SELECT_RIVER_ID
+ cur.execute(stmt, (getUTF8(name),))
+ row = cur.fetchone()
+ if row:
+ return row[0]
+ else:
+ return 0
+
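`getRiverId` above follows the common DB-API fetch-one-or-default pattern. A minimal sketch using sqlite3 purely so it runs standalone; the importer itself targets cx_Oracle (`:s` placeholder) and psycopg2 (`%s`), while sqlite3 uses `?`:

```python
import sqlite3

def get_river_id(dbconn, name):
    # Same fetch-one-or-0 pattern as utils.getRiverId; sqlite3's "?"
    # placeholder stands in for psycopg2's "%s" / cx_Oracle's ":s".
    cur = dbconn.cursor()
    cur.execute("SELECT id FROM rivers WHERE name = ?", (name,))
    row = cur.fetchone()
    return row[0] if row else 0

# Tiny in-memory fixture for demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rivers (id INTEGER, name TEXT)")
conn.execute("INSERT INTO rivers VALUES (7, 'Mosel')")
```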
+def getUTF8(string):
+ """
+ Tries to convert the string to a UTF-8 encoding by first checking if it
+ is UTF-8 and then trying cp1252
+ """
+ try:
+ return unicode.encode(unicode(string, "UTF-8"), "UTF-8")
+ except UnicodeDecodeError:
+ # Probably European Windows names so lets try again
+ return unicode.encode(unicode(string, "cp1252"), "UTF-8")
+
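`getUTF8` above decodes as UTF-8 first and falls back to cp1252 for typical Windows-created names. The same strategy expressed in Python 3 terms (the patch itself is Python 2; `decode_with_fallback` is a hypothetical name):

```python
def decode_with_fallback(raw):
    # Try UTF-8 first; fall back to cp1252 for European Windows
    # encodings, mirroring getUTF8/getUTF8Path in the patch.
    try:
        return raw.decode("utf-8")
    except UnicodeDecodeError:
        return raw.decode("cp1252")
```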
def getUTF8Path(path):
"""
Tries to convert path to utf-8 by first checking the filesystemencoding
@@ -31,3 +70,53 @@
except UnicodeDecodeError:
# Probably European Windows names so lets try again
return unicode.encode(unicode(path, "cp1252"), "UTF-8")
+
+WKB_MAP = {
+ ogr.wkb25Bit : 'wkb25Bit',
+ ogr.wkbGeometryCollection : 'wkbGeometryCollection',
+ ogr.wkbGeometryCollection25D :'wkbGeometryCollection25D',
+ ogr.wkbLineString : 'wkbLineString',
+ ogr.wkbLineString25D : 'wkbLineString25D',
+ ogr.wkbLinearRing : 'wkbLinearRing',
+ ogr.wkbMultiLineString : 'wkbMultiLineString',
+ ogr.wkbMultiLineString25D : 'wkbMultiLineString25D',
+ ogr.wkbMultiPoint : 'wkbMultiPoint',
+ ogr.wkbMultiPoint25D : 'wkbMultiPoint25D',
+ ogr.wkbMultiPolygon : 'wkbMultiPolygon',
+ ogr.wkbMultiPolygon25D : 'wkbMultiPolygon25D',
+ ogr.wkbNDR : 'wkbNDR',
+ ogr.wkbNone : 'wkbNone',
+ ogr.wkbPoint : 'wkbPoint',
+ ogr.wkbPoint25D : 'wkbPoint25D',
+ ogr.wkbPolygon : 'wkbPolygon',
+ ogr.wkbPolygon25D : 'wkbPolygon25D',
+ ogr.wkbUnknown : 'wkbUnknown',
+ ogr.wkbXDR : 'wkbXDR'
+}
+
+def getWkbString(type):
+ return WKB_MAP.get(type) or "Unknown"
+
+def convertToMultiLine(feature):
+ """
+ Converts a feature to a multiline feature.
+ """
+ geometry = feature.GetGeometryRef()
+ # SRS information is lost while forcing to multiline
+ srs = geometry.GetSpatialReference()
+ geometry = ogr.ForceToMultiLineString(geometry)
+ geometry.AssignSpatialReference(srs)
+ feature.SetGeometry(geometry)
+ return feature
+
+def convertToMultiPolygon(feature):
+ """
+ Converts a feature to a multipolygon feature.
+ """
+ geometry = feature.GetGeometryRef()
+ # SRS information is lost while forcing to multipolygon
+ srs = geometry.GetSpatialReference()
+ geometry = ogr.ForceToMultiPolygon(geometry)
+ geometry.AssignSpatialReference(srs)
+ feature.SetGeometry(geometry)
+ return feature
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/doc/documentation/de/importer-geodaesie.tex
--- a/flys-backend/doc/documentation/de/importer-geodaesie.tex Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/doc/documentation/de/importer-geodaesie.tex Fri Mar 22 11:25:54 2013 +0100
@@ -1,10 +1,14 @@
\section{Geodatenimport}
-Der Geodaten Importer ist ein in Python geschriebenes Kommandozeilen Tool zum
-Import von Shapefiles in eine Datenbank. Zum Lesen der Shapefiles und zum
-Schreiben der Geodaten in die Datenbank wird GDAL verwendet. Der Import in eine
-Oracle Datenbank erfordert, dass GDAL und GDAL Python Bindungs mit
-Oracle Unterstützung installiert sind. Weitere Details hierzu befinden sich im
+Der Geodaten Importer ist ein in der Programmiersprache Python
+geschriebenes Kommandozeilen-Werkzeug zum Import von Shapefiles in
+eine Datenbank.
+Zum Lesen der Shapefiles und zum Schreiben der Geodaten
+in die Datenbank wird die GDAL-Bibliothek verwendet.
+Um Daten in eine Oracle Datenbank zu importieren, ist es nötig, dass
+GDAL und die GDAL-Python-Bindings mit Oracle-Unterstützung installiert
+sind. Bei der Verwendung von PostgreSQL entfällt dieser Schritt.
+Weitere Details hierzu befinden sich im
Kapitel \ref{Systemanforderungen} und \ref{Installationsanleitung}.
Der Importer kann mit einem Shellscript von der Kommandozeile gestartet werden
@@ -13,23 +17,32 @@
importiert werden sollen. Für jede Klasse gibt es einen speziellen
Parser, der die speziellen Attribute eines Shapefiles liest und in die entsprechende
Relation der Datenbank schreibt. Die Parser sind speziell auf das
-Dateisystem der BfG ausgerichtet. So wird z.B. erwartet, dass die Shapefiles der
+Dateisystem der BfG ausgerichtet. So wird beispielsweise erwartet, dass die Shapefiles der
Gewässerachse im Ordner $Geodaesie/Flussachse+km$ liegen. Weitere Informationen zu
den einzelnen Parsern sind dem nächsten Kapitel \ref{Beschreibung der Parser} zu
entnehmen. Der Erfolg oder Misserfolg eines Shape-Imports wird je nach
Konfiguration im Logfile vermerkt. Folgende Einträge können dem Logfile
entnommen werden:
+%TODO etwas zum srs schreiben.
+
\textbf{INFO: Inserted 4 features}
\\Gibt die Anzahl der erfolgreich importierten Features an.\\
\textbf{INFO: Failed to create 2 features}
\\Gibt die Anzahl der Features an, die nicht importiert werden konnten.\\
-\textbf{INFO: Found 3 unsupported features}
+\textbf{INFO: Found 3 unsupported features of type: wkbMultiLineString}
\\Gibt die Anzahl der Features an, die aufgrund ihres Datentyps nicht importiert
-werden konnten. Z.B: es werden Linien erwartet, im Shapefile sind jedoch
-Polygone enthalten.\\
+werden konnten, etwa wenn Punkte erwartet wurden, im Shapefile jedoch
+Polygone enthalten waren.\\
+
+\textbf{INFO: Did not import values from fields: TYP ID GRUENDUNG BHW}
+\\Manche Importer versuchen neben der geographischen Information weitere
+Felder in die Datenbank einzulesen. Um festzustellen, ob ein Feld aufgrund
+von Tippfehlern oder unterschiedlicher Schreibweise nicht importiert wurde,
+gibt diese Information Auskunft darüber, welche Felder aus der Shape-Datei
+nicht verwendet wurden.\\
\textbf{ERROR: No source SRS given! No transformation possible!}
\\Das Shapefile enthält keine Information, in welcher Projektion die Geometrien
@@ -88,7 +101,7 @@
\begin{tabular}[t]{ll}
Pfad & Hydrologie/Hydr.Grenzen/Linien \\
Geometrie & LINESTRING, POLYGON \\
-Attribute & name, kind \\
+Attribute & name, kind, sectie, sobek \\
\end{tabular}
\subsubsection{Bauwerke}
@@ -149,8 +162,9 @@
\hspace{1cm}
\begin{tabular}[t]{ll}
Pfad & Hydrologie/HW-Schutzanlagen \\
-Geometrie & LINESTRING \\
-Attribute & TYP, Bauart, Name, name \\
+Geometrie & LINESTRING, POINT \\
+Attribute & name, source, description, status\_date, agency,
+ dike\_km, range, z\_target, rated\_level, z \\
\end{tabular}
@@ -163,19 +177,6 @@
\end{tabular}
-\subsubsection{Linien}
-\hspace{1cm}
-\begin{tabular}[t]{ll}
-Pfad & Geodaesie/Linien \\
-Geometrie & LINESTRING, MULTILINESTRING \\
-Attribute & name, TYP, Z \\
-
-Anmerkung & Wenn kein Attribut 'TYP' definiert ist, wird standardmäßig der Wert \\
- & 'DAMM' angenommen. Fehlt ein Attribut 'Z' wird '9999' als Höhe \\
- & angenommen. \\
-\end{tabular}
-
-
\subsubsection{Überschwemmungsfläche}
\hspace{1cm}
\begin{tabular}[t]{ll}
@@ -184,79 +185,6 @@
Attribut & name, diff, count, area, perimeter \\
\end{tabular}
-
-\subsection{Systemanforderungen}
-\label{Systemanforderungen}
-\begin{itemize}
- \item Oracle Datenbank inkl. Schema für FLYS
- \item GDAL Binding für Python mit Oracle Support
- \item ogr2ogr
- \item Python $>=$ 2.6
-\end{itemize}
-
-
-\subsection{Installationsanleitung}
-\label{Installationsanleitung}
-\begin{itemize}
-
- \item Python\\
- Zum Starten des Importers ist es notwendig Python zu installieren. Dies können
- Sie mit folgendem Befehl auf der Kommandozeile erledigen:
-
- \begin{lstlisting}
- zypper in python
- \end{lstlisting}
-
- \item Oracle Instantclient\\
- Der Oracle Instantclient 11.2 wird benötigt, damit der Importer mittels Python
- und GDAL in die bestehende Oracle Datenbank schreiben kann. Dazu ist es
- erforderlich, folgende Archive von Oracle herunterzuladen. Zu finden sind die
- folgenden Pakete unter\\
- \href{http://www.oracle.com/technetwork/topics/linuxx86-64soft-092277.html}{http://www.oracle.com/technetwork/topics/linuxx86-64soft-092277.html}
-
- \begin{itemize}
- \item instantclient-basic-linux-x86-64-11.2.0.2.0.zip
- \item instantclient-sdk-linux-x86-64-11.2.0.2.0.zip
- \item instantclient-sqlplus-linux-x86-64-11.2.0.2.0.zip
- \end{itemize}
-
- Anschließend führen Sie folgende Befehle auf der Kommandozeile aus:
-
- \begin{lstlisting}
-
- mkdir /opt
-
- unzip ~/instantclient-basic-linux-x86-64-11.2.0.2.0.zip -d /opt
- unzip ~/instantclient-sdk-linux-x86-64-11.2.0.2.0.zip -d /opt
- unzip ~/instantclient-sqlplus-linux-x86-64-11.2.0.2.0.zip -d /opt
-
- mkdir /opt/instantclient_11_2/lib
- cd /opt/instantclient_11_2/lib
- ln -s ../libclntsh.so.11.1 .
- ln -s ../libclntsh.so.11.1 libclntsh.so
- ln -s ../libnnz11.so .
- ln -s ../libocci.so.11.1 .
- ln -s ../libocci.so.11.1 libocci.so
- ln -s ../libociei.so .
- ln -s ../libocijdbc11.so .
- ln -s ../libsqlplusic.so .
- ln -s ../libsqlplus.so .
-
- rpm -i --nodeps ~/flys-importer/rpm/RPMS/x86_64/libgdal1180-1.8.0-intevation1.x86_64.rpm
- rpm -i --nodeps ~/flys-importer/rpm/RPMS/x86_64/libgdal180-devel-1.8.0-intevation1.x86_64.rpm
- rpm -i --nodeps ~/flys-importer/rpm/RPMS/x86_64/gdal180-1.8.0-intevation1.x86_64.rpm
-
- \end{lstlisting}
-
- Sollten keine Fehler aufgetreten sein, haben Sie den \textit{Oracle
- Instantclient 11.2} erfolgreich entpackt und im Dateisystem unter
- \textit{/opt/instantclient\_11\_2} abgelegt. Mit den Befehlen $rpm -i --nodeps$
- haben Sie anschließend die notwendigen Bindings installiert, damit der Importer
- die Geodaten in die Oracle Datenbank schreiben kann.
-
-\end{itemize}
-
-
\subsection{Konfiguration}
\label{Konfiguration}
Der Geodaten Importer kann über die Datei \textit{contrib/run\_geo.sh}
@@ -267,12 +195,11 @@
\textbf{RIVER\_PATH}
\\Der Pfad zum Gewässer im Dateisystem.
-\textbf{RIVER\_ID}
-\\Die Datenbank ID des zu importierenden Gewässers.
-
-\textbf{TARGET\_SRS}
-\\Das EPSG Referenzsystem in das die Geodaten beim Import projeziert werden
-sollen.
+\textbf{RIVER\_NAME}
+\\Der Datenbank-Name des zu importierenden Gewässers. Wird dieser Parameter
+nicht übergeben, werden die Ordnernamen in dem mit dem Parameter RIVER\_PATH
+angegebenen Verzeichnis als Flussnamen interpretiert, und es wird versucht,
+diese zu importieren.
\textbf{HOST}
\\Der Host der Datenbank.
@@ -312,9 +239,6 @@
\textbf{SKIP\_CROSSSECTIONS}
\\Bei gesetztem Wert `1` werden keine Querprofilespuren importiert.
-\textbf{SKIP\_LINES}
-\\Bei gesetztem Wert `1` werden keine Linien importiert.
-
\textbf{SKIP\_FIXPOINTS}
\\Bei gesetztem Wert `1` werden keine Festpunkte importiert.
@@ -333,15 +257,17 @@
\textbf{SKIP\_HWS\_POINTS}
\\Bei gesetztem Wert `1` werden keine Hochwasserschutz-Punktdaten importiert.
-\textbf{SKIP\_GAUGE\_LOCATION}
-\\Bei gesetztem Wert `1` werden keine Pegelorte importiert.
-
\textbf{SKIP\_CATCHMENTS}
\\Bei gesetztem Wert `1` werden keine Einzugsgebiete importiert.
\textbf{SKIP\_UESG}
\\Bei gesetztem Wert `1` werden keine Überschwemmungsflächen importiert.
+\textbf{SKIP\_DGM}
+\\Bei gesetztem Wert `1` werden keine Informationen über Digitale Geländemodelle importiert.
+
+\textbf{SKIP\_JETTIES}
+\\Bei gesetztem Wert `1` werden keine Informationen über Buhnen importiert.
\subsection{Starten des Geodaten Importers}
\label{Starten des Geodaten Importers}
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/doc/documentation/de/importer-hydr-morph.tex
--- a/flys-backend/doc/documentation/de/importer-hydr-morph.tex Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/doc/documentation/de/importer-hydr-morph.tex Fri Mar 22 11:25:54 2013 +0100
@@ -107,7 +107,7 @@
ausgeschlossen.
\subsubsection{Profilspuren (*.w80-Dateien)}
-Der Import von W80-Profilspuren kann mit \textbf{-Dflys.backend.importer.skip.w80s=true}
+Der Import von W80-Profilspuren kann mit \textbf{-Dflys.backend.importer.skip.w80=true}
unterdrückt werden. Es werden rekursiv alle *.w80-Dateien aus \textit{../../..}
relativ zur gewaesser.wst-Datei betrachtet. Vor dem Import werden mit Hilfe
eines Längen- und eines MD5-Summen-Vergleichs inhaltliche Duplikate
@@ -760,7 +760,7 @@
gestartet. Dazu führen Sie folgenden Befehl aus:\\
\begin{lstlisting}
- contrib/run_hydr_morph.sh
+ ./run_hydr_morph.sh
\end{lstlisting}
Nachdem der Prompt der Konsole zurückkehrt, ist der Import abgeschlossen oder es
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/doc/documentation/de/importer-manual.tex
--- a/flys-backend/doc/documentation/de/importer-manual.tex Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/doc/documentation/de/importer-manual.tex Fri Mar 22 11:25:54 2013 +0100
@@ -26,9 +26,9 @@
% Document DATE and VERSION
% set these values when releasing a new version
-\newcommand{\documentdate}{30. August 2012}
-\newcommand{\documentversion}{1.0}
-\newcommand{\documentrevision}{rev5303}
+\newcommand{\documentdate}{19. Februar 2013}
+\newcommand{\documentversion}{1.1}
+\newcommand{\documentrevision}{rev5062}
\newcommand{\documentID}{importer-manual.tex}
%----------------------------------------------
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/doc/documentation/de/overview.tex
--- a/flys-backend/doc/documentation/de/overview.tex Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/doc/documentation/de/overview.tex Fri Mar 22 11:25:54 2013 +0100
@@ -49,8 +49,15 @@
Bitte beachten Sie, dass diese Werkzeuge nicht zur Installation und zum Betrieb
der Software selbst notwendig sind!
+\subsection{Systemanforderungen}
+\label{Systemanforderungen}
+\begin{itemize}
+  \item Oracle- oder PostgreSQL-Datenbank inkl. Schema für FLYS
+ \item SUSE Enterprise Linux 11.2 SP 1
+\end{itemize}
-\subsubsection{Vorbereiten der Datenbank}
+\subsection{Installationsanleitung}
+\label{Installationsanleitung}
Nachdem Sie das Paket nun in das Heimatverzeichnis des Nutzers auf das
Zielsystem kopiert haben, entpacken Sie es mit folgenden Befehlen:
@@ -61,6 +68,95 @@
cd flys-importer
\end{lstlisting}
+\subsubsection{Java}
+Der flys-importer benötigt Java in Version 6. Um diese zu installieren, laden Sie
+bitte von \url{http://www.oracle.com/technetwork/java/javase/downloads/jdk6downloads-1902814.html}
+eine aktuelle Java-Version als -rpm.bin herunter, zum Beispiel jdk-6u41-linux-x64-rpm.bin.
+
+Nach dem Herunterladen öffnen Sie eine Konsole und wechseln in das Downloadverzeichnis.
+Führen Sie nun folgende Befehle aus:
+
+ \begin{lstlisting}
+ su - # login als root
+ sh jdk-6u41-linux-x64-rpm.bin
+ <bestaetigen mit enter>
+ update-alternatives --install /usr/bin/java java /usr/java/jdk1.6.0_41/bin/java 5
+ update-alternatives --install /etc/alternatives/jre jre /usr/java/jdk1.6.0_41/jre 5
+ update-alternatives --config java
+ \end{lstlisting}
+
+\subsubsection{Python und GDAL}
+Installieren Sie nun die restlichen benötigten Pakete.
+Dazu installieren Sie zuerst einige Abhängigkeiten und anschließend die
+von der Intevation GmbH bereitgestellten speziellen Versionen von proj und libgeos.
+
+Um die Abhängigkeiten zu installieren, führen Sie bitte folgende Befehle aus:
+
+ \begin{lstlisting}
+ zypper ar http://download.opensuse.org/repositories/home:/intevation:/bfg/SLE_11/ "intevation:bfg"
+ rpm --import http://download.opensuse.org/repositories/home:/intevation:/bfg/SLE_11/repodata/repomd.xml.key
+    zypper ref # Paketliste neu laden
+ zypper in python libgeos0 libproj0 proj netcdf libnetcdf4 \
+ xerces-c libxerces-c-3_0 unixODBC postgresql-libs
+ zypper mr -d "intevation:bfg"
+ zypper ref # Paketliste neu laden
+ \end{lstlisting}
+
+%\subsubsection Oracle Instantclient\\
+%Der Oracle Instantclient 11.2.0.2.0 wird benötigt, damit der Importer mittels Python
+%und GDAL in die bestehende Oracle Datenbank schreiben kann. Wenn Sie
+%eine PostgreSQL Datenbank verwenden, können Sie diesen Schritt überspringen.
+%
+%Zur Anbindung von Oracle ist es erforderlich, folgende Archive von
+%Oracle herunterzuladen (Sie benötigen dafür ein Oracle Benutzerkonto):
+%
+%Der Oracle Instantclient 11.2 wird benötigt, damit der Importer mittels Python
+%und GDAL in die bestehende Oracle Datenbank schreiben kann. Dazu ist es
+%erforderlich, folgende Archive von Oracle herunterzuladen.
+%
+% \begin{itemize}
+% \item instantclient-basic-linux-x86-64-11.2.0.2.0.zip
+% \item instantclient-sdk-linux-x86-64-11.2.0.2.0.zip
+% \item instantclient-sqlplus-linux-x86-64-11.2.0.2.0.zip
+% \end{itemize}
+%
+%Zu finden sind die
+% Pakete unter:\\
+%\href{http://www.oracle.com/technetwork/topics/linuxx86-64soft-092277.html}
+%{http://www.oracle.com/technetwork/topics/linuxx86-64soft-092277.html}
+%
+%
+%Um diese Pakete zu installieren, öffnen Sie eine Konsole und wechseln
+%in das Verzeichnis in welches Sie die heruntergeladenen Pakete
+%gespeichert haben. (z.B.: cd /home/benutzername/Downloads )
+% Anschließend führen Sie folgende Befehle auf der Kommandozeile aus:
+%
+% \begin{lstlisting}
+% unzip instantclient-basic-linux-x86-64-11.2.0.2.0.zip -d /opt
+% unzip instantclient-sdk-linux-x86-64-11.2.0.2.0.zip -d /opt
+% unzip instantclient-sqlplus-linux-x86-64-11.2.0.2.0.zip -d /opt
+%
+% mkdir /opt/instantclient_11_2/lib
+% cd /opt/instantclient_11_2/lib
+% ln -s ../libclntsh.so.11.1 .
+% ln -s ../libclntsh.so.11.1 libclntsh.so
+% ln -s ../libnnz11.so .
+% ln -s ../libocci.so.11.1 .
+% ln -s ../libocci.so.11.1 libocci.so
+% ln -s ../libociei.so .
+% ln -s ../libocijdbc11.so .
+% ln -s ../libsqlplusic.so .
+% ln -s ../libsqlplus.so .
+%
+% echo "/opt/instantclient_11_2/lib/" > /etc/ld.so.conf.d/oci.conf
+% ldconfig
+% \end{lstlisting}
+%
+%Sollten keine Fehler aufgetreten sein, haben Sie den \textit{Oracle
+% Instantclient 11.2} erfolgreich entpackt und im Dateisystem unter
+% \textit{/opt/instantclient\_11\_2} abgelegt.
+%
+\subsubsection{Vorbereiten der Datenbank}
Bevor die Importer verwendet werden können, ist es notwendig, dass eine leere
Oracle Datenbank vorhanden ist. Anschließend müssen folgende SQL Skripte in
diese Datenbank eingespielt werden:
@@ -82,24 +178,27 @@
Mittels dieses SQL-Skripts werden die Indizes zum geodätischen Datenbankschema\\
hinzugefügt.
-\item import-dems.sql \\
-In diesem Skript sind Befehle zum Einfügen der digitalen Geländemodelle
-enthalten. Die Dateipfade in diesem Skript sind so anzupassen, dass sie auf die
-entsprechenden Geländemodelle im Dateisystem verweisen. Es ist notwendig die
-Pfade absolut anzugeben.
-
\end{enumerate}
Zum Einspielen dieser Schemata setzen Sie folgende Befehle auf der Kommandozeile
ab. Beachten Sie, dass $sqlplus$ im Pfad liegen muss, und der Linux-Nutzer
dieses Kommando ausführen können muss. Außerdem sind $benutzername$ und $passwort$
entsprechend Ihres Datenbank-Zugangs anzupassen.
+SQLPlus befindet sich in /opt/instantclient\_11\_2. Um es verfügbar zu machen,
+führen Sie im Importer-Verzeichnis folgende Befehle aus:
+
+\begin{lstlisting}
+export LD_LIBRARY_PATH=/opt/instantclient_11_2/lib:$LD_LIBRARY_PATH
+export PATH=/opt/instantclient_11_2:$PATH
+\end{lstlisting}
+
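Eine defensivere Variante dieser Exports lässt sich so skizzieren (Annahme: der Instantclient wurde wie beschrieben nach /opt/instantclient\_11\_2 entpackt; die Ausgabe dient nur der Kontrolle):

```shell
# Sketch: prepend the Instantclient directories exactly once and verify.
ORACLE_DIR=/opt/instantclient_11_2
# Avoid a dangling ":" when LD_LIBRARY_PATH was previously unset.
export LD_LIBRARY_PATH="$ORACLE_DIR/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
export PATH="$ORACLE_DIR:$PATH"

# Confirm that the sqlplus directory is now the first PATH entry.
case ":$PATH:" in
    *":$ORACLE_DIR:"*) echo "sqlplus directory is on PATH" ;;
esac
```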
+Nun erstellen Sie das Schema:
\begin{lstlisting}
sqlplus benutzername/passwort @schema/oracle.sql
sqlplus benutzername/passwort @schema/oracle-minfo.sql
sqlplus benutzername/passwort @schema/oracle-spatial.sql
sqlplus benutzername/passwort @schema/oracle-spatial_idx.sql
- sqlplus benutzername/passwort @schema/import-dems.sql
\end{lstlisting}
+
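Die vier sqlplus-Aufrufe lassen sich gleichwertig als Schleife schreiben (Skizze; benutzername/passwort sind Platzhalter für Ihre Datenbank-Zugangsdaten, die Befehle werden hier nur ausgegeben statt ausgeführt):

```shell
# Sketch: run the four schema scripts in the documented order.
for s in oracle.sql oracle-minfo.sql oracle-spatial.sql oracle-spatial_idx.sql; do
    echo "sqlplus benutzername/passwort @schema/$s"
done
```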
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/doc/schema/import-dems.sql
--- a/flys-backend/doc/schema/import-dems.sql Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,24 +0,0 @@
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Elbe'), 'GRD_00000_01010', 0.0, 101.0, 2003, 2007, 'GK-3', 'DHHN92', 'ESRI-Grid', false, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Elbe/Geodaesie/Hoehenmodelle/m_00000_10110.grd');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Elbe'), 'GRD_00992_02030', 99.0, 203.0, 2003, 2007, 'GK-3', 'DHHN92', 'ESRI-Grid', false, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Elbe/Geodaesie/Hoehenmodelle/m_09920_20300.grd');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Elbe'), 'GRD_02020_02998', 202.0, 300.0, 2003, 2007, 'GK-3', 'DHHN92', 'ESRI-Grid', false, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Elbe/Geodaesie/Hoehenmodelle/m_20200_29980.grd');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Elbe'), 'GRD_02981_04010', 298.0, 401.0, 2003, 2007, 'GK-3', 'DHHN92', 'ESRI-Grid', false, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Elbe/Geodaesie/Hoehenmodelle/m_29810_40100.grd');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Elbe'), 'GRD_04000_05009', 400.0, 501.0, 2003, 2007, 'GK-3', 'DHHN92', 'ESRI-Grid', false, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Elbe/Geodaesie/Hoehenmodelle/m_40000_50090.grd');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Elbe'), 'GRD_05001_05830', 500.0, 583.0, 2003, 2007, 'GK-3', 'DHHN92', 'ESRI-Grid', false, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Elbe/Geodaesie/Hoehenmodelle/m_50010_58330.grd');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Mosel'), 'GRD_00000_00058', 0.0, 6.0, null, null, 'GK-2', 'DHHN85', 'ASCII-Grid', false, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Mosel/Geodaesie/Hoehenmodelle/DGMW-ASCII/0000-0580.xyz');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Mosel'), 'GRD_00058_00153', 6.0, 15.0, null, null, 'GK-2', 'DHHN85', 'ASCII-Grid', false, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Mosel/Geodaesie/Hoehenmodelle/DGMW-ASCII/0058-0153.xyz');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Mosel'), 'GRD_00153_00416', 15.0, 42.0, null, null, 'GK-2', 'DHHN85', 'ASCII-Grid', false, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Mosel/Geodaesie/Hoehenmodelle/DGMW-ASCII/0153-0416.xyz');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Mosel'), 'GRD_00414_01012_O', 41.0, 101.0, null, null, 'GK-2', 'DHHN85', 'ASCII-Grid', false, '2', 'muss überarbeitet werden', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Mosel/Geodaesie/Hoehenmodelle/DGMW-ASCII/0414-1012O.xyz');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Mosel'), 'GRD_00414_01012_W', 41.0, 101.0, null, null, 'GK-2', 'DHHN85', 'ASCII-Grid', false, '2', 'muss überarbeitet werden', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Mosel/Geodaesie/Hoehenmodelle/DGMW-ASCII/0414-1012W.xyz');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Mosel'), 'GRD_01012_01488', 101.0, 145.0, null, null, 'GK-2', 'DHHN85', 'ASCII-Grid', false, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Mosel/Geodaesie/Hoehenmodelle/DGMW-ASCII/1012-1488.xyz');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Mosel'), 'GRD_01488_01666', 145.0, 167.0, null, null, 'GK-2', 'DHHN85', 'ASCII-Grid', false, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Mosel/Geodaesie/Hoehenmodelle/DGMW-ASCII/1488-1666.xyz');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Mosel'), 'GRD_01666_01960', 167.0, 196.0, null, null, 'GK-2', 'DHHN85', 'ASCII-Grid', false, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Mosel/Geodaesie/Hoehenmodelle/DGMW-ASCII/1666-1960.xyz');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Mosel'), 'GRD_01960_02044', 196.0, 204.0, null, null, 'GK-2', 'DHHN85', 'ASCII-Grid', false, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Mosel/Geodaesie/Hoehenmodelle/DGMW-ASCII/1960-2044.XYZ');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Mosel'), 'GRD_02044_02184', 204.0, 218.0, null, null, 'GK-2', 'DHHN85', 'ASCII-Grid', false, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Mosel/Geodaesie/Hoehenmodelle/DGMW-ASCII/2044-2184.XYZ');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Mosel'), 'GRD_02184_02420', 218.0, 242.0, null, null, 'GK-2', 'DHHN85', 'ASCII-Grid', false, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Mosel/Geodaesie/Hoehenmodelle/DGMW-ASCII/525480MO.XYZ');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Saar'), 'GRD_00000_00079', 0.0, 8.0, 1999, 2002, 'GK-2', '', 'ASCII-Grid', true, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Saar/Geodaesie/Hoehenmodelle/km0000-0079_long.txt');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Saar'), 'GRD_00080_00204', 8.0, 20.0, 1999, 2002, 'GK-2', '', 'ASCII-Grid', true, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Saar/Geodaesie/Hoehenmodelle/km0080-0204_long.txt');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Saar'), 'GRD_00205_00314', 20.0, 31.0, 1999, 2002, 'GK-2', '', 'ASCII-Grid', true, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Saar/Geodaesie/Hoehenmodelle/km0205-0314_long.txt');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Saar'), 'GRD_00315_00541', 31.0, 54.0, 1999, 2002, 'GK-2', '', 'ASCII-Grid', true, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Saar/Geodaesie/Hoehenmodelle/km0315-0541_long.txt');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Saar'), 'GRD_00542_00655', 54.0, 65.0, 1999, 2002, 'GK-2', '', 'ASCII-Grid', true, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Saar/Geodaesie/Hoehenmodelle/km0542-0655_long.txt');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Saar'), 'GRD_00656_00828', 65.0, 83.0, 1999, 2002, 'GK-2', '', 'ASCII-Grid', true, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Saar/Geodaesie/Hoehenmodelle/km0656-0828_long.txt');
-INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,projection, elevation_state, format, border_break, resolution, description, path) VALUES ((SELECT id from rivers WHERE name = 'Saar'), 'GRD_00829_00931', 83.0, 93.0, 1999, 2002, 'GK-2', '', 'ASCII-Grid', true, '2', '', '/vol1/projects/Geospatial/flys-3.0/testdaten/Gewaesser/Saar/Geodaesie/Hoehenmodelle/km0829-0931_erweitert.txt');
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/doc/schema/oracle-drop-minfo.sql
--- a/flys-backend/doc/schema/oracle-drop-minfo.sql Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/doc/schema/oracle-drop-minfo.sql Fri Mar 22 11:25:54 2013 +0100
@@ -20,12 +20,10 @@
ALTER TABLE morphologic_width DROP CONSTRAINT fk_mw_unit_id;
ALTER TABLE morphologic_width_values DROP CONSTRAINT fk_mwv_morphologic_width_id;
ALTER TABLE flow_velocity_model_values DROP CONSTRAINT fk_fvv_flow_velocity_model_id;
-ALTER TABLE flow_velocity_model DROP CONSTRAINT fk_fvm_river_id;
ALTER TABLE flow_velocity_model DROP CONSTRAINT fk_fvm_discharge_zone_id;
ALTER TABLE discharge_zone DROP CONSTRAINT fk_dz_river_id;
ALTER TABLE flow_velocity_measurements DROP CONSTRAINT fk_fvm_rivers_id;
ALTER TABLE flow_velocity_measure_values DROP CONSTRAINT fk_fvmv_measurements_id;
-ALTER TABLE grain_fraction DROP CONSTRAINT fk_gf_unit_id;
ALTER TABLE sediment_yield DROP CONSTRAINT fk_sy_grain_fraction_id;
ALTER TABLE sediment_yield DROP CONSTRAINT fk_sy_unit_id;
ALTER TABLE sediment_yield DROP CONSTRAINT fk_sy_time_interval_id;
@@ -42,6 +40,10 @@
ALTER TABLE sq_relation DROP CONSTRAINT fk_sqr_tinterval_id;
ALTER TABLE sq_relation DROP CONSTRAINT fk_sqr_river_id;
ALTER TABLE sq_relation_value DROP CONSTRAINT fk_sqr_id;
+ALTER TABLE measurement_station DROP CONSTRAINT fk_ms_river_id;
+ALTER TABLE measurement_station DROP CONSTRAINT fk_ms_range_id;
+ALTER TABLE measurement_station DROP CONSTRAINT fk_ms_reference_gauge_id;
+ALTER TABLE measurement_station DROP CONSTRAINT fk_ms_observation_timerange_id;
DROP TABLE bed_height_type;
DROP TABLE location_system;
@@ -69,6 +71,7 @@
DROP TABLE waterlevel_difference;
DROP TABLE waterlevel_difference_column;
DROP TABLE waterlevel_difference_values;
+DROP TABLE measurement_station;
DROP TABLE sq_relation_value;
DROP TABLE sq_relation;
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/doc/schema/oracle-drop-spatial.sql
--- a/flys-backend/doc/schema/oracle-drop-spatial.sql Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/doc/schema/oracle-drop-spatial.sql Fri Mar 22 11:25:54 2013 +0100
@@ -13,11 +13,6 @@
DELETE FROM USER_SDO_GEOM_METADATA WHERE TABLE_NAME = 'CROSS_SECTION_TRACKS';
DROP SEQUENCE CROSS_SECTION_TRACKS_ID_SEQ;
-DROP TRIGGER lines_trigger;
-DROP TABLE lines;
-DELETE FROM USER_SDO_GEOM_METADATA WHERE TABLE_NAME = 'LINES';
-DROP SEQUENCE LINES_ID_SEQ;
-
DROP TRIGGER buildings_trigger;
DROP TABLE buildings;
DELETE FROM USER_SDO_GEOM_METADATA WHERE TABLE_NAME = 'BUILDINGS';
@@ -42,10 +37,15 @@
DELETE FROM USER_SDO_GEOM_METADATA WHERE TABLE_NAME = 'CATCHMENT';
DROP SEQUENCE CATCHMENT_ID_SEQ;
-DROP TRIGGER hws_trigger;
-DROP TABLE hws;
-DELETE FROM USER_SDO_GEOM_METADATA WHERE TABLE_NAME = 'HWS';
-DROP SEQUENCE HWS_ID_SEQ;
+DROP TRIGGER hws_lines_trigger;
+DROP TABLE hws_lines;
+DELETE FROM USER_SDO_GEOM_METADATA WHERE TABLE_NAME = 'HWS_LINES';
+DROP SEQUENCE HWS_LINES_ID_SEQ;
+
+DROP TRIGGER hws_points_trigger;
+DROP TABLE hws_points;
+DELETE FROM USER_SDO_GEOM_METADATA WHERE TABLE_NAME = 'HWS_POINTS';
+DROP SEQUENCE HWS_POINTS_ID_SEQ;
DROP TRIGGER floodmaps_trigger;
DROP TABLE floodmaps;
@@ -66,3 +66,12 @@
DROP TABLE gauge_location;
DELETE FROM USER_SDO_GEOM_METADATA WHERE TABLE_NAME = 'GAUGE_LOCATION';
DROP SEQUENCE GAUGE_LOCATION_ID_SEQ;
+
+DROP TABLE hws_kinds;
+DROP TABLE sectie_kinds;
+DROP TABLE sobek_kinds;
+DROP TABLE fed_states;
+DROP TABLE axis_kinds;
+DROP TABLE boundary_kinds;
+DROP TABLE cross_section_track_kinds;
+DROP TABLE floodplain_kinds;
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/doc/schema/oracle-drop.sql
--- a/flys-backend/doc/schema/oracle-drop.sql Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/doc/schema/oracle-drop.sql Fri Mar 22 11:25:54 2013 +0100
@@ -29,6 +29,7 @@
ALTER TABLE wst_columns DROP CONSTRAINT cWstColumnsWsts;
ALTER TABLE wst_q_ranges DROP CONSTRAINT cWstQRangesRanges;
ALTER TABLE wsts DROP CONSTRAINT cWstsRivers;
+ALTER TABLE wsts DROP CONSTRAINT cWstsWstKinds;
DROP TABLE annotation_types;
DROP TABLE annotations;
DROP TABLE attributes;
@@ -57,6 +58,7 @@
DROP TABLE wst_columns;
DROP TABLE wst_q_ranges;
DROP TABLE wsts;
+DROP TABLE wst_kinds;
DROP SEQUENCE ANNOTATION_TYPES_ID_SEQ;
DROP SEQUENCE ANNOTATIONS_ID_SEQ;
DROP SEQUENCE ATTRIBUTES_ID_SEQ;
@@ -88,3 +90,7 @@
DROP VIEW wst_value_table;
DROP VIEW wst_w_values ;
DROP VIEW wst_q_values;
+DROP VIEW official_lines;
+DROP VIEW q_main_values;
+DROP VIEW official_q_values;
+DROP VIEW wst_ranges;
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/doc/schema/oracle-minfo.sql
--- a/flys-backend/doc/schema/oracle-minfo.sql Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/doc/schema/oracle-minfo.sql Fri Mar 22 11:25:54 2013 +0100
@@ -20,15 +20,19 @@
CONSTRAINT fk_unit FOREIGN KEY (unit_id) REFERENCES units(id)
);
-CREATE SEQUENCE BED_HEIGHT_TYPE_SEQ;
+-- lookup table for bedheight types
CREATE TABLE bed_height_type (
id NUMBER(38,0) NOT NULL,
- name VARCHAR(16) NOT NULL,
- description VARCHAR(255),
+ name VARCHAR(65) NOT NULL,
PRIMARY KEY(id)
);
-
+INSERT INTO bed_height_type VALUES (1, 'Querprofile');
+INSERT INTO bed_height_type VALUES (2, 'Flächenpeilung');
+INSERT INTO bed_height_type VALUES (3, 'Flächen- u. Querprofilpeilungen');
+INSERT INTO bed_height_type VALUES (4, 'DGM');
+INSERT INTO bed_height_type VALUES (5, 'TIN');
+INSERT INTO bed_height_type VALUES (6, 'Modell');
CREATE SEQUENCE BED_HEIGHT_SINGLE_ID_SEQ;
@@ -188,11 +192,9 @@
CREATE TABLE flow_velocity_model (
id NUMBER(38,0) NOT NULL,
- river_id NUMBER(38,0) NOT NULL,
discharge_zone_id NUMBER(38,0) NOT NULL,
description VARCHAR(256),
PRIMARY KEY (id),
- CONSTRAINT fk_fvm_river_id FOREIGN KEY (river_id) REFERENCES rivers(id),
CONSTRAINT fk_fvm_discharge_zone_id FOREIGN KEY (discharge_zone_id) REFERENCES discharge_zone (id)
);
@@ -246,9 +248,7 @@
name VARCHAR(64) NOT NULL,
lower NUMBER(38,3),
upper NUMBER(38,3),
- unit_id NUMBER(38,0),
-    PRIMARY KEY (id),
+    PRIMARY KEY (id)
- CONSTRAINT fk_gf_unit_id FOREIGN KEY (unit_id) REFERENCES units(id)
);
@@ -281,75 +281,25 @@
);
-CREATE SEQUENCE WATERLEVEL_ID_SEQ;
-
-CREATE TABLE waterlevel (
- id NUMBER(38,0) NOT NULL,
- river_id NUMBER(38,0) NOT NULL,
- unit_id NUMBER(38,0) NOT NULL,
- description VARCHAR(256),
- PRIMARY KEY (id),
- CONSTRAINT fk_w_river_id FOREIGN KEY (river_id) REFERENCES rivers(id),
- CONSTRAINT fk_w_unit_id FOREIGN KEY (unit_id) REFERENCES units(id)
-);
-
-
-CREATE SEQUENCE WATERLEVEL_Q_RANGES_ID_SEQ;
-
-CREATE TABLE waterlevel_q_range (
- id NUMBER(38,0) NOT NULL,
- waterlevel_id NUMBER(38,0) NOT NULL,
- q NUMBER(38,2) NOT NULL,
- PRIMARY KEY (id),
- CONSTRAINT fk_wqr_waterlevel_id FOREIGN KEY (waterlevel_id) REFERENCES waterlevel(id)
-);
-
-
-CREATE SEQUENCE WATERLEVEL_VALUES_ID_SEQ;
-
-CREATE TABLE waterlevel_values (
- id NUMBER(38,0) NOT NULL,
- waterlevel_q_range_id NUMBER(38,0) NOT NULL,
- station NUMBER(38,3) NOT NULL,
- w NUMBER(38,2) NOT NULL,
- PRIMARY KEY (id),
- CONSTRAINT fk_wv_waterlevel_q_range_id FOREIGN KEY (waterlevel_q_range_id) REFERENCES waterlevel_q_range(id)
-);
-
-
-CREATE SEQUENCE WATERLEVEL_DIFFERENCE_ID_SEQ;
-
-CREATE TABLE waterlevel_difference (
- id NUMBER(38,0) NOT NULL,
- river_id NUMBER(38,0) NOT NULL,
- unit_id NUMBER(38,0) NOT NULL,
- description VARCHAR(256),
- PRIMARY KEY (id),
- CONSTRAINT fk_wd_river_id FOREIGN KEY (river_id) REFERENCES rivers (id),
- CONSTRAINT fk_wd_unit_id FOREIGN KEY (unit_id) REFERENCES units(id)
-);
-
-
-CREATE SEQUENCE WATERLEVEL_DIFF_COLUMN_ID_SEQ;
-
-CREATE TABLE waterlevel_difference_column (
- id NUMBER(38,0) NOT NULL,
- difference_id NUMBER(38,0) NOT NULL,
- description VARCHAR(256),
- PRIMARY KEY (id),
- CONSTRAINT fk_wdc_difference_id FOREIGN KEY (difference_id) REFERENCES waterlevel_difference (id)
-);
-
-
-CREATE SEQUENCE WATERLEVEL_DIFF_VALUES_ID_SEQ;
-
-CREATE TABLE waterlevel_difference_values (
- id NUMBER(38,0) NOT NULL,
- column_id NUMBER(38,0) NOT NULL,
- station NUMBER(38,3) NOT NULL,
- value NUMBER(38,2) NOT NULL,
- PRIMARY KEY (id),
- CONSTRAINT fk_wdv_column_id FOREIGN KEY (column_id) REFERENCES waterlevel_difference_column (id)
+CREATE SEQUENCE MEASUREMENT_STATION_ID_SEQ;
+CREATE TABLE measurement_station (
+ id NUMBER(38) NOT NULL,
+ name VARCHAR(256) NOT NULL,
+ river_id NUMBER(38) NOT NULL,
+ station NUMBER(38,3) NOT NULL,
+ range_id NUMBER(38) NOT NULL,
+ measurement_type VARCHAR(64) NOT NULL,
+ riverside VARCHAR(16),
+ reference_gauge_id NUMBER(38),
+ observation_timerange_id NUMBER(38),
+ operator VARCHAR(64),
+ description VARCHAR(512),
+ PRIMARY KEY (id),
+ CONSTRAINT fk_ms_river_id FOREIGN KEY (river_id) REFERENCES rivers(id) ON DELETE CASCADE,
+ CONSTRAINT fk_ms_range_id FOREIGN KEY (range_id) REFERENCES ranges(id) ON DELETE CASCADE,
+ CONSTRAINT fk_ms_reference_gauge_id FOREIGN KEY (reference_gauge_id) REFERENCES gauges(id) ON DELETE CASCADE,
+ CONSTRAINT fk_ms_observation_timerange_id FOREIGN KEY (observation_timerange_id) REFERENCES time_intervals(id),
+ UNIQUE (river_id, station)
);
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/doc/schema/oracle-spatial-migrate-dami.sql
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-backend/doc/schema/oracle-spatial-migrate-dami.sql Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,122 @@
+DROP TRIGGER hws_trigger;
+DROP TABLE hws;
+DROP SEQUENCE HWS_ID_SEQ;
+
+--Static lookup tables for Hochwasserschutzanlagen
+CREATE TABLE hws_kinds (
+ id NUMBER PRIMARY KEY NOT NULL,
+ kind VARCHAR(64) NOT NULL
+);
+INSERT INTO hws_kinds (id, kind) VALUES (1, 'Durchlass');
+INSERT INTO hws_kinds (id, kind) VALUES (2, 'Damm');
+INSERT INTO hws_kinds (id, kind) VALUES (3, 'Graben');
+
+CREATE TABLE fed_states (
+ id NUMBER PRIMARY KEY NOT NULL,
+ name VARCHAR(23) NOT NULL
+);
+INSERT INTO fed_states (id, name) VALUES (1, 'Bayern');
+INSERT INTO fed_states (id, name) VALUES (2, 'Hessen');
+INSERT INTO fed_states (id, name) VALUES (3, 'Niedersachsen');
+INSERT INTO fed_states (id, name) VALUES (4, 'Nordrhein-Westfalen');
+INSERT INTO fed_states (id, name) VALUES (5, 'Rheinland-Pfalz');
+INSERT INTO fed_states (id, name) VALUES (6, 'Saarland');
+INSERT INTO fed_states (id, name) VALUES (7, 'Schleswig-Holstein');
+INSERT INTO fed_states (id, name) VALUES (8, 'Brandenburg');
+INSERT INTO fed_states (id, name) VALUES (9, 'Mecklenburg-Vorpommern');
+INSERT INTO fed_states (id, name) VALUES (10, 'Thüringen');
+INSERT INTO fed_states (id, name) VALUES (11, 'Baden-Württemberg');
+INSERT INTO fed_states (id, name) VALUES (12, 'Sachsen-Anhalt');
+INSERT INTO fed_states (id, name) VALUES (13, 'Sachsen');
+INSERT INTO fed_states (id, name) VALUES (14, 'Berlin');
+INSERT INTO fed_states (id, name) VALUES (15, 'Bremen');
+INSERT INTO fed_states (id, name) VALUES (16, 'Hamburg');
+
+-- HWS-Lines
+CREATE SEQUENCE HWS_LINES_ID_SEQ;
+CREATE TABLE hws_lines (
+ OGR_FID NUMBER(38),
+ GEOM MDSYS.SDO_GEOMETRY,
+ kind_id NUMBER(2) DEFAULT 2 REFERENCES hws_kinds(id),
+ fed_state_id NUMBER(2) REFERENCES fed_states(id),
+ river_id NUMBER(38) REFERENCES rivers(id),
+ name VARCHAR(256),
+ path VARCHAR(256),
+ official NUMBER DEFAULT 0,
+ agency VARCHAR(256),
+ range VARCHAR(256),
+ shore_side NUMBER DEFAULT 0,
+ source VARCHAR(256),
+ status_date TIMESTAMP,
+ description VARCHAR(256),
+ id NUMBER PRIMARY KEY NOT NULL
+);
+INSERT INTO USER_SDO_GEOM_METADATA VALUES ('hws_lines', 'GEOM', MDSYS.SDO_DIM_ARRAY(MDSYS.SDO_DIM_ELEMENT('X',3282450,3912240,0.001),MDSYS.SDO_DIM_ELEMENT('Y',5248260,6100130,0.001),MDSYS.SDO_DIM_ELEMENT('Z',-100000,100000,0.002)), 31467);
+CREATE INDEX hws_lines_spatial_idx ON hws_lines(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=LINE');
+
+CREATE OR REPLACE TRIGGER hws_lines_trigger BEFORE INSERT ON hws_lines FOR each ROW
+ BEGIN
+ SELECT HWS_LINES_ID_SEQ.nextval INTO :new.id FROM dual;
+ END;
+
+-- HWS Points lookup tables
+CREATE TABLE sectie_kinds (
+ id NUMBER PRIMARY KEY NOT NULL,
+ name VARCHAR(64) NOT NULL
+);
+INSERT INTO sectie_kinds (id, name) VALUES (0, 'Unbekannt');
+INSERT INTO sectie_kinds (id, name) VALUES (1, 'Flussschlauch');
+INSERT INTO sectie_kinds (id, name) VALUES (2, 'Uferbank');
+INSERT INTO sectie_kinds (id, name) VALUES (3, 'Überflutungsbereich');
+
+CREATE TABLE sobek_kinds (
+ id NUMBER PRIMARY KEY NOT NULL,
+ name VARCHAR(64) NOT NULL
+);
+INSERT INTO sobek_kinds (id, name) VALUES (0, 'Unbekannt');
+INSERT INTO sobek_kinds (id, name) VALUES (1, 'Stromführend');
+INSERT INTO sobek_kinds (id, name) VALUES (2, 'Stromspeichernd');
+
+CREATE TABLE boundary_kinds (
+ id NUMBER PRIMARY KEY NOT NULL,
+ name VARCHAR(64) NOT NULL
+);
+INSERT INTO boundary_kinds (id, name) VALUES (0, 'Unbekannt');
+INSERT INTO boundary_kinds (id, name) VALUES (1, 'BfG');
+INSERT INTO boundary_kinds (id, name) VALUES (2, 'Land');
+INSERT INTO boundary_kinds (id, name) VALUES (3, 'Sonstige');
+
+-- HWS Points
+CREATE SEQUENCE HWS_POINTS_ID_SEQ;
+CREATE TABLE hws_points (
+ OGR_FID NUMBER(38),
+ GEOM MDSYS.SDO_GEOMETRY,
+ kind_id NUMBER DEFAULT 2 REFERENCES hws_kinds(id),
+ fed_state_id NUMBER REFERENCES fed_states(id),
+ river_id NUMBER(38) REFERENCES rivers(id) ON DELETE CASCADE,
+ name VARCHAR(256),
+ path VARCHAR(256),
+ official NUMBER DEFAULT 0,
+ agency VARCHAR(256),
+ range VARCHAR(256),
+ shore_side NUMBER DEFAULT 0,
+ source VARCHAR(256),
+ status_date VARCHAR(256),
+ description VARCHAR(256),
+ freeboard NUMBER(19,5),
+ dike_km NUMBER(19,5),
+ z NUMBER(19,5),
+ z_target NUMBER(19,5),
+ rated_level NUMBER(19,5),
+ id NUMBER PRIMARY KEY NOT NULL
+);
+
+-- Alterations to existing tables
+ALTER TABLE dem ADD srid NUMBER NOT NULL;
+ALTER TABLE hydr_boundaries_poly ADD sectie NUMBER REFERENCES sectie_kinds(id);
+ALTER TABLE hydr_boundaries_poly ADD sobek NUMBER REFERENCES sobek_kinds(id);
+ALTER TABLE hydr_boundaries ADD sectie NUMBER REFERENCES sectie_kinds(id);
+ALTER TABLE hydr_boundaries ADD sobek NUMBER REFERENCES sobek_kinds(id);
+ALTER TABLE hydr_boundaries ADD kind NUMBER REFERENCES boundary_kinds(id);
+ALTER TABLE hydr_boundaries_poly ADD kind NUMBER REFERENCES boundary_kinds(id);
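The migration above relies on `ON DELETE CASCADE` so that removing a river also removes its dependent geometry rows. A minimal sketch of that semantics, using Python's sqlite3 as a stand-in for Oracle (the table and column names mirror the schema above; everything else is simplified):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # sqlite enforces FKs only when enabled
conn.execute("CREATE TABLE rivers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""
    CREATE TABLE hws_points (
        id INTEGER PRIMARY KEY,
        river_id INTEGER REFERENCES rivers(id) ON DELETE CASCADE,
        name TEXT
    )""")
conn.execute("INSERT INTO rivers VALUES (1, 'Rhein')")
conn.execute("INSERT INTO hws_points VALUES (1, 1, 'Deich A')")

# Deleting the river silently removes its dependent hws_points rows.
conn.execute("DELETE FROM rivers WHERE id = 1")
remaining = conn.execute("SELECT COUNT(*) FROM hws_points").fetchone()[0]
print(remaining)  # 0
```

Without the cascade (plain `REFERENCES rivers(id)`, as on `hws_lines`), the same `DELETE FROM rivers` would fail with a foreign-key violation instead.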
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/doc/schema/oracle-spatial.sql
--- a/flys-backend/doc/schema/oracle-spatial.sql Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/doc/schema/oracle-spatial.sql Fri Mar 22 11:25:54 2013 +0100
@@ -1,10 +1,20 @@
+WHENEVER SQLERROR EXIT;
+
+CREATE TABLE axis_kinds(
+ id NUMBER PRIMARY KEY NOT NULL,
+ name VARCHAR(64)
+);
+INSERT INTO axis_kinds(id, name) VALUES (0, 'Unbekannt');
+INSERT INTO axis_kinds(id, name) VALUES (1, 'Aktuell');
+INSERT INTO axis_kinds(id, name) VALUES (2, 'Sonstige');
+
-- Geodaesie/Flussachse+km/achse
CREATE SEQUENCE RIVER_AXES_ID_SEQ;
CREATE TABLE river_axes(
OGR_FID NUMBER(38),
GEOM MDSYS.SDO_GEOMETRY,
- river_id NUMBER(38),
- kind NUMBER(38) DEFAULT 0 NOT NULL,
+ river_id NUMBER(38) REFERENCES rivers(id) ON DELETE CASCADE,
+    kind_id NUMBER(38) DEFAULT 0 NOT NULL REFERENCES axis_kinds(id),
name VARCHAR(64),
path VARCHAR(256),
ID NUMBER PRIMARY KEY NOT NULL
@@ -23,13 +33,13 @@
CREATE TABLE river_axes_km(
OGR_FID NUMBER(38),
GEOM MDSYS.SDO_GEOMETRY,
- river_id NUMBER(38),
- km NUMBER(6,3),
+ river_id NUMBER(38) REFERENCES rivers(id) ON DELETE CASCADE,
+ km NUMBER(7,3),
name VARCHAR(64),
path VARCHAR(256),
ID NUMBER PRIMARY KEY NOT NULL
);
-INSERT INTO USER_SDO_GEOM_METADATA VALUES ('river_axes_km', 'GEOM', MDSYS.SDO_DIM_ARRAY(MDSYS.SDO_DIM_ELEMENT('X',3282450,3912240,0.001),MDSYS.SDO_DIM_ELEMENT('Y',5248260,6100130,0.001),MDSYS.SDO_DIM_ELEMENT('Z',-100000,100000,0.002)), 31467);
+INSERT INTO USER_SDO_GEOM_METADATA VALUES ('river_axes_km', 'GEOM', MDSYS.SDO_DIM_ARRAY(MDSYS.SDO_DIM_ELEMENT('X',3282450,3912240,0.001),MDSYS.SDO_DIM_ELEMENT('Y',5248260,6100130,0.001)), 31467);
CREATE OR REPLACE TRIGGER river_axes_km_trigger BEFORE INSERT ON river_axes_km FOR each ROW
BEGIN
SELECT river_axes_km_ID_SEQ.nextval INTO :new.id FROM dual;
@@ -39,11 +49,19 @@
--Geodaesie/Querprofile/QP-Spuren/qps.shp
+CREATE TABLE cross_section_track_kinds(
+ id NUMBER PRIMARY KEY NOT NULL,
+ name VARCHAR(64)
+);
+INSERT INTO cross_section_track_kinds(id, name) VALUES (0, 'Sonstige');
+INSERT INTO cross_section_track_kinds(id, name) VALUES (1, 'Aktuell');
+
CREATE SEQUENCE CROSS_SECTION_TRACKS_ID_SEQ;
CREATE TABLE cross_section_tracks (
OGR_FID NUMBER(38),
GEOM MDSYS.SDO_GEOMETRY,
- river_id NUMBER(38),
+ river_id NUMBER(38) REFERENCES rivers(id) ON DELETE CASCADE,
+    kind_id NUMBER(38) DEFAULT 0 NOT NULL REFERENCES cross_section_track_kinds(id),
km NUMBER(38,12) NOT NULL,
z NUMBER(38,12) DEFAULT 0 NOT NULL,
name VARCHAR(64),
@@ -59,39 +77,12 @@
--CREATE INDEX CrossSectionTracks_spatial_idx ON cross_section_tracks(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=LINE');
--- TODO: TestMe. Fix Importer-Script. Fix oracle_spatial_idx.sql script.
--- Geodaesie/Linien/rohre-und-speeren
-CREATE SEQUENCE LINES_ID_SEQ;
-CREATE TABLE lines (
- OGR_FID NUMBER(38),
- GEOM MDSYS.SDO_GEOMETRY,
- river_id NUMBER(38),
- kind VARCHAR2(16) NOT NULL,
- z NUMBER(38,12) DEFAULT 0,
- name VARCHAR(64),
- path VARCHAR(256),
- ID NUMBER PRIMARY KEY NOT NULL
-);
-INSERT INTO USER_SDO_GEOM_METADATA VALUES ('lines', 'GEOM', MDSYS.SDO_DIM_ARRAY(MDSYS.SDO_DIM_ELEMENT('X',3282450,3912240,0.001),MDSYS.SDO_DIM_ELEMENT('Y',5248260,6100130,0.001),MDSYS.SDO_DIM_ELEMENT('Z',-100000,100000,0.002)), 31467);
-CREATE OR REPLACE TRIGGER lines_trigger BEFORE INSERT ON lines FOR each ROW
- BEGIN
- SELECT LINES_ID_SEQ.nextval INTO :new.id FROM dual;
- END;
-/
--- NOTE: Should lines should be 3D.
--- TODO: Test index.
---CREATE INDEX lines_idx ON lines(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=LINE');
--- 'kind':
--- 0: ROHR1
--- 1: DAMM
-
-
-- Geodaesie/Bauwerke/Wehre.shp
CREATE SEQUENCE BUILDINGS_ID_SEQ;
CREATE TABLE buildings(
OGR_FID NUMBER(38),
GEOM MDSYS.SDO_GEOMETRY,
- river_id NUMBER(38),
+ river_id NUMBER(38) REFERENCES rivers(id) ON DELETE CASCADE,
name VARCHAR2(255),
path VARCHAR(256),
ID NUMBER PRIMARY KEY NOT NULL
@@ -110,7 +101,7 @@
CREATE TABLE fixpoints (
OGR_FID NUMBER(38),
GEOM MDSYS.SDO_GEOMETRY,
- river_id NUMBER(38),
+ river_id NUMBER(38) REFERENCES rivers(id) ON DELETE CASCADE,
x NUMBER(38,11),
y NUMBER(38,11),
km NUMBER(38,11) NOT NULL,
@@ -119,7 +110,7 @@
path VARCHAR(256),
ID NUMBER PRIMARY KEY NOT NULL
);
-INSERT INTO USER_SDO_GEOM_METADATA VALUES ('fixpoints', 'GEOM', MDSYS.SDO_DIM_ARRAY(MDSYS.SDO_DIM_ELEMENT('X',3282450,3912240,0.001),MDSYS.SDO_DIM_ELEMENT('Y',5248260,6100130,0.001),MDSYS.SDO_DIM_ELEMENT('Z',-100000,100000,0.002)), 31467);
+INSERT INTO USER_SDO_GEOM_METADATA VALUES ('fixpoints', 'GEOM', MDSYS.SDO_DIM_ARRAY(MDSYS.SDO_DIM_ELEMENT('X',3282450,3912240,0.001),MDSYS.SDO_DIM_ELEMENT('Y',5248260,6100130,0.001)), 31467);
CREATE OR REPLACE TRIGGER fixpoints_trigger BEFORE INSERT ON fixpoints FOR each ROW
BEGIN
SELECT FIXPOINTS_ID_SEQ.nextval INTO :new.id FROM dual;
@@ -129,11 +120,19 @@
-- Hydrologie/Hydr. Grenzen/talaue.shp
+CREATE TABLE floodplain_kinds(
+ id NUMBER PRIMARY KEY NOT NULL,
+ name VARCHAR(64)
+);
+INSERT INTO floodplain_kinds(id, name) VALUES (0, 'Sonstige');
+INSERT INTO floodplain_kinds(id, name) VALUES (1, 'Aktuell');
+
CREATE SEQUENCE FLOODPLAIN_ID_SEQ;
CREATE TABLE floodplain(
OGR_FID NUMBER(38),
GEOM MDSYS.SDO_GEOMETRY,
- river_id NUMBER(38),
+ river_id NUMBER(38) REFERENCES rivers(id) ON DELETE CASCADE,
+    kind_id NUMBER(38) DEFAULT 0 NOT NULL REFERENCES floodplain_kinds(id),
name VARCHAR(64),
path VARCHAR(256),
ID NUMBER PRIMARY KEY NOT NULL
@@ -147,26 +146,22 @@
--CREATE INDEX floodplain_spatial_idx ON floodplain(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=POLYGON');
--- TODO: Test-Me. Fix Importer-Skript.
--- NOTE: It's not a spatial schema!
-- Geodaesie/Hoehenmodelle/*
CREATE SEQUENCE DEM_ID_SEQ;
CREATE TABLE dem (
- ID NUMBER PRIMARY KEY NOT NULL,
- river_id NUMBER(38),
- -- XXX Should we use the ranges table instead?
- name VARCHAR(64),
- lower NUMBER(19,5),
- upper NUMBER(19,5),
- year_from VARCHAR(32) NOT NULL,
- year_to VARCHAR(32) NOT NULL,
- projection VARCHAR(32) NOT NULL,
- elevation_state VARCHAR(32),
- format VARCHAR(32),
- border_break BOOLEAN NOT NULL DEFAULT FALSE,
- resolution VARCHAR(16),
- description VARCHAR(256),
- path VARCHAR(256)
+ ID NUMBER PRIMARY KEY NOT NULL,
+ river_id NUMBER(38) REFERENCES rivers(id) ON DELETE CASCADE,
+ name VARCHAR(64),
+ range_id NUMBER(38) REFERENCES ranges(id),
+ time_interval_id NUMBER(38) REFERENCES time_intervals(id),
+ projection VARCHAR(32),
+ elevation_state VARCHAR(32),
+ srid NUMBER NOT NULL,
+ format VARCHAR(32),
+ border_break NUMBER(1) DEFAULT 0 NOT NULL,
+ resolution VARCHAR(16),
+ description VARCHAR(256),
+ path VARCHAR(256) NOT NULL
);
CREATE OR REPLACE TRIGGER dem_trigger BEFORE INSERT ON dem FOR each ROW
BEGIN
@@ -174,61 +169,146 @@
END;
/
+-- Static lookup tables for Hochwasserschutzanlagen (flood protection structures)
+CREATE TABLE hws_kinds (
+ id NUMBER PRIMARY KEY NOT NULL,
+ kind VARCHAR(64) NOT NULL
+);
+INSERT INTO hws_kinds (id, kind) VALUES (1, 'Durchlass');
+INSERT INTO hws_kinds (id, kind) VALUES (2, 'Damm');
+INSERT INTO hws_kinds (id, kind) VALUES (3, 'Graben');
--- Hydrologie/Einzugsgebiete/EZG.shp
-CREATE SEQUENCE CATCHMENT_ID_SEQ;
-CREATE TABLE catchment(
+CREATE TABLE fed_states (
+ id NUMBER PRIMARY KEY NOT NULL,
+ name VARCHAR(23) NOT NULL
+);
+INSERT INTO fed_states (id, name) VALUES (1, 'Bayern');
+INSERT INTO fed_states (id, name) VALUES (2, 'Hessen');
+INSERT INTO fed_states (id, name) VALUES (3, 'Niedersachsen');
+INSERT INTO fed_states (id, name) VALUES (4, 'Nordrhein-Westfalen');
+INSERT INTO fed_states (id, name) VALUES (5, 'Rheinland-Pfalz');
+INSERT INTO fed_states (id, name) VALUES (6, 'Saarland');
+INSERT INTO fed_states (id, name) VALUES (7, 'Schleswig-Holstein');
+INSERT INTO fed_states (id, name) VALUES (8, 'Brandenburg');
+INSERT INTO fed_states (id, name) VALUES (9, 'Mecklenburg-Vorpommern');
+INSERT INTO fed_states (id, name) VALUES (10, 'Thüringen');
+INSERT INTO fed_states (id, name) VALUES (11, 'Baden-Württemberg');
+INSERT INTO fed_states (id, name) VALUES (12, 'Sachsen-Anhalt');
+INSERT INTO fed_states (id, name) VALUES (13, 'Sachsen');
+INSERT INTO fed_states (id, name) VALUES (14, 'Berlin');
+INSERT INTO fed_states (id, name) VALUES (15, 'Bremen');
+INSERT INTO fed_states (id, name) VALUES (16, 'Hamburg');
+
+--Hydrologie/HW-Schutzanlagen/hws.shp
+-- HWS-Lines
+CREATE SEQUENCE HWS_LINES_ID_SEQ;
+CREATE TABLE hws_lines (
OGR_FID NUMBER(38),
GEOM MDSYS.SDO_GEOMETRY,
- river_id NUMBER(38),
- area NUMBER(19,5),
- name VARCHAR2(255),
- path VARCHAR(256),
- ID NUMBER PRIMARY KEY NOT NULL
+ kind_id NUMBER(2) DEFAULT 2 REFERENCES hws_kinds(id),
+ fed_state_id NUMBER(2) REFERENCES fed_states(id),
+ river_id NUMBER(38) REFERENCES rivers(id),
+ name VARCHAR(256),
+ path VARCHAR(256),
+ official NUMBER DEFAULT 0,
+ agency VARCHAR(256),
+ range VARCHAR(256),
+ shore_side NUMBER DEFAULT 0,
+ source VARCHAR(256),
+ status_date TIMESTAMP,
+ description VARCHAR(256),
+ id NUMBER PRIMARY KEY NOT NULL
);
-INSERT INTO USER_SDO_GEOM_METADATA VALUES ('CATCHMENT', 'GEOM', MDSYS.SDO_DIM_ARRAY(MDSYS.SDO_DIM_ELEMENT('X',3282450,3912240,0.001),MDSYS.SDO_DIM_ELEMENT('Y',5248260,6100130,0.001),MDSYS.SDO_DIM_ELEMENT('Z',-100000,100000,0.002)), 31467);
-
-CREATE TRIGGER catchment_trigger BEFORE INSERT ON catchment FOR each ROW
+INSERT INTO USER_SDO_GEOM_METADATA VALUES ('hws_lines', 'GEOM', MDSYS.SDO_DIM_ARRAY(MDSYS.SDO_DIM_ELEMENT('X',3282450,3912240,0.001),MDSYS.SDO_DIM_ELEMENT('Y',5248260,6100130,0.001),MDSYS.SDO_DIM_ELEMENT('Z',-100000,100000,0.002)), 31467);
+CREATE OR REPLACE TRIGGER hws_lines_trigger BEFORE INSERT ON hws_lines FOR each ROW
BEGIN
- SELECT CATCHMENT_ID_SEQ.nextval INTO :new.id FROM dual;
+ SELECT HWS_LINES_ID_SEQ.nextval INTO :new.id FROM dual;
END;
/
---CREATE INDEX catchment_spatial_idx ON catchment(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=polygon');
+-- HWS Points lookup tables
+CREATE TABLE sectie_kinds (
+ id NUMBER PRIMARY KEY NOT NULL,
+ name VARCHAR(64) NOT NULL
+);
+INSERT INTO sectie_kinds (id, name) VALUES (0, 'Unbekannt');
+INSERT INTO sectie_kinds (id, name) VALUES (1, 'Flussschlauch');
+INSERT INTO sectie_kinds (id, name) VALUES (2, 'Uferbank');
+INSERT INTO sectie_kinds (id, name) VALUES (3, 'Überflutungsbereich');
---Hydrologie/HW-Schutzanlagen/hws.shp
-CREATE SEQUENCE HWS_ID_SEQ;
-CREATE TABLE hws(
+CREATE TABLE sobek_kinds (
+ id NUMBER PRIMARY KEY NOT NULL,
+ name VARCHAR(64) NOT NULL
+);
+INSERT INTO sobek_kinds (id, name) VALUES (0, 'Unbekannt');
+INSERT INTO sobek_kinds (id, name) VALUES (1, 'Stromführend');
+INSERT INTO sobek_kinds (id, name) VALUES (2, 'Stromspeichernd');
+
+CREATE TABLE boundary_kinds (
+ id NUMBER PRIMARY KEY NOT NULL,
+ name VARCHAR(64) NOT NULL
+);
+INSERT INTO boundary_kinds (id, name) VALUES (0, 'Unbekannt');
+INSERT INTO boundary_kinds (id, name) VALUES (1, 'BfG');
+INSERT INTO boundary_kinds (id, name) VALUES (2, 'Land');
+INSERT INTO boundary_kinds (id, name) VALUES (3, 'Sonstige');
+
+-- HWS Points
+CREATE SEQUENCE HWS_POINTS_ID_SEQ;
+CREATE TABLE hws_points (
OGR_FID NUMBER(38),
GEOM MDSYS.SDO_GEOMETRY,
- river_id NUMBER(38),
- hws_facility VARCHAR2(255),
- type VARCHAR2(255),
- name VARCHAR(64),
- path VARCHAR(256),
- ID NUMBER PRIMARY KEY NOT NULL
+ kind_id NUMBER DEFAULT 2 REFERENCES hws_kinds(id),
+ fed_state_id NUMBER REFERENCES fed_states(id),
+ river_id NUMBER(38) REFERENCES rivers(id) ON DELETE CASCADE,
+ name VARCHAR(256),
+ path VARCHAR(256),
+ official NUMBER DEFAULT 0,
+ agency VARCHAR(256),
+ range VARCHAR(256),
+ shore_side NUMBER DEFAULT 0,
+ source VARCHAR(256),
+ status_date VARCHAR(256),
+ description VARCHAR(256),
+ freeboard NUMBER(19,5),
+ dike_km NUMBER(19,5),
+ z NUMBER(19,5),
+ z_target NUMBER(19,5),
+ rated_level NUMBER(19,5),
+ id NUMBER PRIMARY KEY NOT NULL
);
-INSERT INTO USER_SDO_GEOM_METADATA VALUES ('hws', 'GEOM', MDSYS.SDO_DIM_ARRAY(MDSYS.SDO_DIM_ELEMENT('X',3282450,3912240,0.001),MDSYS.SDO_DIM_ELEMENT('Y',5248260,6100130,0.001),MDSYS.SDO_DIM_ELEMENT('Z',-100000,100000,0.002)), 31467);
-CREATE OR REPLACE TRIGGER hws_trigger BEFORE INSERT ON hws FOR each ROW
+
+INSERT INTO USER_SDO_GEOM_METADATA VALUES ('hws_points', 'GEOM', MDSYS.SDO_DIM_ARRAY(MDSYS.SDO_DIM_ELEMENT('X',3282450,3912240,0.001),MDSYS.SDO_DIM_ELEMENT('Y',5248260,6100130,0.001),MDSYS.SDO_DIM_ELEMENT('Z',-100000,100000,0.002)), 31467);
+
+CREATE OR REPLACE TRIGGER hws_points_trigger BEFORE INSERT ON hws_points FOR each ROW
BEGIN
- SELECT HWS_ID_SEQ.nextval INTO :new.id FROM dual;
+ SELECT HWS_POINTS_ID_SEQ.nextval INTO :new.id FROM dual;
END;
/
---CREATE INDEX hws_spatial_idx ON hws(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=LINE');
-
--Hydrologie/UeSG
+CREATE TABLE floodmap_kinds (
+ id NUMBER PRIMARY KEY NOT NULL,
+ name varchar(64) NOT NULL
+);
+INSERT INTO floodmap_kinds VALUES (200, 'Messung');
+INSERT INTO floodmap_kinds VALUES (111, 'Berechnung-Aktuell-BfG');
+INSERT INTO floodmap_kinds VALUES (112, 'Berechnung-Aktuell-Bundesländer');
+INSERT INTO floodmap_kinds VALUES (121, 'Berechnung-Potenziell-BfG');
+INSERT INTO floodmap_kinds VALUES (122, 'Berechnung-Potenziell-Bundesländer');
+
CREATE SEQUENCE FLOODMAPS_ID_SEQ;
CREATE TABLE floodmaps (
OGR_FID NUMBER(38),
GEOM MDSYS.SDO_GEOMETRY,
- river_id NUMBER(38),
+ river_id NUMBER(38) REFERENCES rivers(id) ON DELETE CASCADE,
name VARCHAR(255),
- kind NUMBER(38),
+ kind NUMBER NOT NULL REFERENCES floodmap_kinds(id),
diff NUMBER(19,5),
count NUMBER(38),
area NUMBER(19,5),
perimeter NUMBER(19,5),
path VARCHAR(256),
+ source varchar(64),
id NUMBER PRIMARY KEY NOT NULL
);
INSERT INTO USER_SDO_GEOM_METADATA VALUES ('floodmaps', 'GEOM', MDSYS.SDO_DIM_ARRAY(MDSYS.SDO_DIM_ELEMENT('X',3282450,3912240,0.001),MDSYS.SDO_DIM_ELEMENT('Y',5248260,6100130,0.001),MDSYS.SDO_DIM_ELEMENT('Z',-100000,100000,0.002)), 31467);
@@ -237,17 +317,17 @@
SELECT FLOODMAPS_ID_SEQ.nextval INTO :new.id FROM dual;
END;
/
-CREATE INDEX floodmaps_spatial_idx ON floodmaps(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=MULTIPOLYGON');
-
--Hydrologie/Hydr.Grenzen/Linien
CREATE SEQUENCE HYDR_BOUNDARIES_ID_SEQ;
CREATE TABLE hydr_boundaries (
OGR_FID NUMBER(38),
GEOM MDSYS.SDO_GEOMETRY,
- river_id NUMBER(38),
+ river_id NUMBER(38) REFERENCES rivers(id) ON DELETE CASCADE,
name VARCHAR(255),
- kind NUMBER(38),
+ kind NUMBER(38) REFERENCES boundary_kinds(id),
+ sectie NUMBER(38) REFERENCES sectie_kinds(id),
+ sobek NUMBER(38) REFERENCES sobek_kinds(id),
path VARCHAR(256),
id NUMBER PRIMARY KEY NOT NULL
);
@@ -257,15 +337,16 @@
SELECT HYDR_BOUNDARIES_ID_SEQ.nextval INTO :new.id FROM dual;
END;
/
-CREATE INDEX hydr_boundaries_idx ON hydr_boundaries(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=LINE');
CREATE SEQUENCE HYDR_BOUNDARIES_POLY_ID_SEQ;
CREATE TABLE hydr_boundaries_poly (
OGR_FID NUMBER(38),
GEOM MDSYS.SDO_GEOMETRY,
- river_id NUMBER(38),
+ river_id NUMBER(38) REFERENCES rivers(id) ON DELETE CASCADE,
name VARCHAR(255),
- kind NUMBER(38),
+ kind NUMBER(38) REFERENCES boundary_kinds(id),
+ sectie NUMBER(38) REFERENCES sectie_kinds(id),
+ sobek NUMBER(38) REFERENCES sobek_kinds(id),
path VARCHAR(256),
id NUMBER PRIMARY KEY NOT NULL
);
@@ -275,8 +356,6 @@
SELECT HYDR_BOUNDARIES_POLY_ID_SEQ.nextval INTO :new.id FROM dual;
END;
/
-CREATE INDEX hydr_boundaries_poly_idx ON hydr_boundaries_poly(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=MULTIPOLYGON');
-
-- Hydrologie/Streckendaten/
CREATE SEQUENCE GAUGE_LOCATION_ID_SEQ;
@@ -294,4 +373,30 @@
SELECT GAUGE_LOCATION_ID_SEQ.nextval INTO :new.id FROM dual;
END;
/
-CREATE INDEX gauge_location_idx ON gauge_location(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=POINT');
+
+
+CREATE TABLE jetty_kinds(
+ id NUMBER PRIMARY KEY NOT NULL,
+ name VARCHAR(64)
+);
+INSERT INTO jetty_kinds VALUES (0, 'Buhnenkopf');
+INSERT INTO jetty_kinds VALUES (1, 'Buhnenfuß');
+INSERT INTO jetty_kinds VALUES (2, 'Buhnenwurzel');
+
+CREATE SEQUENCE JETTIES_ID_SEQ;
+CREATE TABLE jetties (
+ OGR_FID NUMBER(38),
+ GEOM MDSYS.SDO_GEOMETRY,
+ id NUMBER PRIMARY KEY NOT NULL,
+ river_id NUMBER(38) REFERENCES rivers(id) ON DELETE CASCADE,
+ path VARCHAR(256),
+ kind_id NUMBER(38) REFERENCES jetty_kinds(id),
+ km NUMBER(7,3),
+ z NUMBER(38,12)
+);
+INSERT INTO USER_SDO_GEOM_METADATA VALUES ('jetties', 'GEOM', MDSYS.SDO_DIM_ARRAY(MDSYS.SDO_DIM_ELEMENT('X',3282450,3912240,0.001),MDSYS.SDO_DIM_ELEMENT('Y',5248260,6100130,0.001)), 31467);
+CREATE OR REPLACE TRIGGER jetties_trigger BEFORE INSERT ON jetties FOR EACH ROW
+ BEGIN
+ SELECT JETTIES_ID_SEQ.nextval INTO :new.id FROM dual;
+ END;
+/
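Each spatial table above pairs a sequence with a `BEFORE INSERT` trigger so that rows loaded without an id (e.g. by a shapefile importer) still receive one. A rough analogue in Python's sqlite3, where `INTEGER PRIMARY KEY` auto-assignment plays the role of `JETTIES_ID_SEQ` plus `jetties_trigger` (a simplification — the Oracle trigger fires per row and reads the sequence explicitly):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE jetties (id INTEGER PRIMARY KEY, km REAL, kind_id INTEGER)")

# No id supplied: the engine assigns the next value, as the Oracle trigger
# would via JETTIES_ID_SEQ.nextval.
conn.execute("INSERT INTO jetties (km, kind_id) VALUES (12.3, 0)")
conn.execute("INSERT INTO jetties (km, kind_id) VALUES (12.4, 1)")
ids = [row[0] for row in conn.execute("SELECT id FROM jetties ORDER BY id")]
print(ids)  # [1, 2]
```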
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/doc/schema/oracle-spatial_idx.sql
--- a/flys-backend/doc/schema/oracle-spatial_idx.sql Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/doc/schema/oracle-spatial_idx.sql Fri Mar 22 11:25:54 2013 +0100
@@ -1,9 +1,28 @@
-CREATE INDEX catchment_spatial_idx ON catchment(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=multipolygon');
-CREATE INDEX river_axes_km_spatial_idx ON river_axes_km(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=point');
+-- TODO: index prevents `DELETE FROM rivers' on 11g
+-- Error: "Ebenendimensionalitat stimmt nicht mit Geometrie-Dimensionen uberein" (layer dimensionality does not match geometry dimensions)
+-- CREATE INDEX river_axes_km_spatial_idx ON river_axes_km(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=point');
+
CREATE INDEX buildings_spatial_idx ON buildings(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=LINE');
-CREATE INDEX fixpoints_spatial_idx ON fixpoints(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=POINT');
-CREATE INDEX river_axes_spatial_idx ON river_axes(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=LINE');
+
+-- TODO: index prevents `DELETE FROM rivers' on 11g
+-- Error: "Ebenendimensionalitat stimmt nicht mit Geometrie-Dimensionen uberein" (layer dimensionality does not match geometry dimensions)
+-- CREATE INDEX fixpoints_spatial_idx ON fixpoints(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=POINT');
+
+CREATE INDEX river_axes_spatial_idx ON river_axes(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=MULTILINE');
+
CREATE INDEX CrossSectionTracks_spatial_idx ON cross_section_tracks(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=LINE');
-CREATE INDEX hws_spatial_idx ON hws(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=LINE');
+
CREATE INDEX floodplain_spatial_idx ON floodplain(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=POLYGON');
-CREATE INDEX lines_idx ON lines(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=LINE');
+
+CREATE INDEX hydr_boundaries_idx ON hydr_boundaries(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=MULTILINE');
+
+CREATE INDEX hws_points_spatial_idx ON hws_points(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=POINT');
+CREATE INDEX hws_lines_spatial_idx ON hws_lines(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=MULTILINE');
+
+CREATE INDEX floodmaps_spatial_idx ON floodmaps(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=MULTIPOLYGON');
+
+CREATE INDEX gauge_location_idx ON gauge_location(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=POINT');
+CREATE INDEX hydr_boundaries_poly_idx ON hydr_boundaries_poly(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=MULTIPOLYGON');
+
+CREATE INDEX jetties_idx ON jetties(GEOM) indextype IS MDSYS.SPATIAL_INDEX parameters ('LAYER_GTYPE=POINT');
+
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/doc/schema/oracle.sql
--- a/flys-backend/doc/schema/oracle.sql Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/doc/schema/oracle.sql Fri Mar 22 11:25:54 2013 +0100
@@ -115,14 +115,17 @@
CREATE TABLE gauges (
id NUMBER(38,0) NOT NULL,
- aeo NUMBER(38,2),
- datum NUMBER(38,2),
- name VARCHAR2(255),
- station NUMBER(38,2),
- official_number NUMBER(38,0),
- range_id NUMBER(38,0),
+ aeo NUMBER(38,2) NOT NULL,
+ datum NUMBER(38,2) NOT NULL,
+ name VARCHAR2(255) NOT NULL,
+ station NUMBER(38,2) NOT NULL,
+ official_number NUMBER(38,0) UNIQUE,
+ range_id NUMBER(38,0) NOT NULL,
+    -- TODO: remove river_id here because range_id already references the river
river_id NUMBER(38,0),
- PRIMARY KEY (id)
+ PRIMARY KEY (id),
+ UNIQUE (name, river_id),
+ UNIQUE (river_id, station)
);
@@ -238,13 +241,14 @@
CREATE TABLE ranges (
id NUMBER(38,0) NOT NULL,
- a NUMBER(38,10),
+ a NUMBER(38,10) NOT NULL,
b NUMBER(38,10),
river_id NUMBER(38,0),
PRIMARY KEY (id)
);
+
-- RIVERS
CREATE SEQUENCE RIVERS_ID_SEQ;
@@ -263,9 +267,10 @@
CREATE TABLE time_intervals (
id NUMBER(38,0) NOT NULL,
- start_time TIMESTAMP,
+ start_time TIMESTAMP NOT NULL,
stop_time TIMESTAMP,
- PRIMARY KEY (id)
+ PRIMARY KEY (id),
+ CHECK (start_time <= stop_time)
);
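The added `CHECK (start_time <= stop_time)` rejects inverted intervals at insert time. A minimal sqlite3 sketch of the same guard, with the timestamps simplified to ISO-8601 `TEXT` (which compares correctly lexicographically):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE time_intervals (
        id INTEGER PRIMARY KEY,
        start_time TEXT NOT NULL,
        stop_time TEXT,
        CHECK (start_time <= stop_time)
    )""")

# A well-ordered interval is accepted.
conn.execute("INSERT INTO time_intervals VALUES (1, '2001-01-01', '2010-12-31')")
try:
    # An inverted interval trips the CHECK constraint.
    conn.execute("INSERT INTO time_intervals VALUES (2, '2010-12-31', '2001-01-01')")
    violated = False
except sqlite3.IntegrityError:
    violated = True
print(violated)  # True
```

Note that a row with `stop_time` NULL still passes, since `start_time <= NULL` evaluates to unknown and CHECK only rejects false.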
@@ -328,6 +333,21 @@
-- WSTS
+-- Lookup table for wst kinds
+CREATE TABLE wst_kinds (
+ id NUMBER PRIMARY KEY NOT NULL,
+ kind VARCHAR(64) NOT NULL
+);
+INSERT INTO wst_kinds (id, kind) VALUES (0, 'basedata');
+INSERT INTO wst_kinds (id, kind) VALUES (1, 'basedata_additionals_marks');
+INSERT INTO wst_kinds (id, kind) VALUES (2, 'basedata_fixations_wst');
+INSERT INTO wst_kinds (id, kind) VALUES (3, 'basedata_officials');
+INSERT INTO wst_kinds (id, kind) VALUES (4, 'basedata_heightmarks-points-relative_points');
+INSERT INTO wst_kinds (id, kind) VALUES (5, 'basedata_flood-protections_relative_points');
+INSERT INTO wst_kinds (id, kind) VALUES (6, 'morpho_waterlevel-differences');
+INSERT INTO wst_kinds (id, kind) VALUES (7, 'morpho_waterlevels');
+
+
CREATE SEQUENCE WSTS_ID_SEQ;
CREATE TABLE wsts (
@@ -340,38 +360,41 @@
-- ADD CONSTRAINTs
-ALTER TABLE annotations ADD CONSTRAINT cAnnotationsRanges FOREIGN KEY (range_id) REFERENCES ranges;
+ALTER TABLE annotations ADD CONSTRAINT cAnnotationsAttributes FOREIGN KEY (attribute_id) REFERENCES attributes;
ALTER TABLE annotations ADD CONSTRAINT cAnnotationsEdges FOREIGN KEY (edge_id) REFERENCES edges;
ALTER TABLE annotations ADD CONSTRAINT cAnnotationsPositions FOREIGN KEY (position_id) REFERENCES positions;
-ALTER TABLE annotations ADD CONSTRAINT cAnnotationsAttributes FOREIGN KEY (attribute_id) REFERENCES attributes;
ALTER TABLE annotations ADD CONSTRAINT cAnnotationsTypes FOREIGN KEY (type_id) REFERENCES annotation_types;
-ALTER TABLE cross_section_lines ADD CONSTRAINT cQPSLinesCrossSections FOREIGN KEY (cross_section_id) REFERENCES cross_sections;
-ALTER TABLE cross_section_points ADD CONSTRAINT cQPSPointsCrossSectionLines FOREIGN KEY (cross_section_line_id) REFERENCES cross_section_lines;
-ALTER TABLE cross_sections ADD CONSTRAINT cCrossSectionsRivers FOREIGN KEY (river_id) REFERENCES rivers;
ALTER TABLE cross_sections ADD CONSTRAINT cCrossSectionsTimeIntervals FOREIGN KEY (time_interval_id) REFERENCES time_intervals;
-ALTER TABLE discharge_table_values ADD CONSTRAINT cTableValuesDischargeTables foreign key (table_id) REFERENCES discharge_tables;
ALTER TABLE discharge_tables ADD CONSTRAINT cDischargeTablesTime_intervals FOREIGN KEY (time_interval_id) REFERENCES time_intervals;
-ALTER TABLE discharge_tables ADD CONSTRAINT cDischargeTablesGauges FOREIGN KEY (gauge_id) REFERENCES gauges;
-ALTER TABLE gauges ADD CONSTRAINT cGaugesRivers FOREIGN KEY (river_id) REFERENCES rivers;
-ALTER TABLE gauges ADD CONSTRAINT cGaugesRanges FOREIGN KEY (range_id) REFERENCES ranges;
-ALTER TABLE hyk_entries ADD CONSTRAINT cHykEntriesHyks FOREIGN KEY (hyk_id) REFERENCES hyks;
-ALTER TABLE hyk_flow_zones ADD CONSTRAINT cHykFlowZonesHykFormations FOREIGN KEY (formation_id) REFERENCES hyk_formations;
ALTER TABLE hyk_flow_zones ADD CONSTRAINT cHykFlowZonesHykFlowZoneTypes FOREIGN KEY (type_id) REFERENCES hyk_flow_zone_types;
-ALTER TABLE hyks ADD CONSTRAINT cHyksRivers FOREIGN KEY (river_id) REFERENCES rivers;
-ALTER TABLE hyk_formations ADD CONSTRAINT cHykFormationsHykEntries FOREIGN KEY (hyk_entry_id) REFERENCES hyk_entries;
+ALTER TABLE main_values ADD CONSTRAINT cMainValuesNamedMainValues FOREIGN KEY (named_value_id) REFERENCES named_main_values;
ALTER TABLE main_values ADD CONSTRAINT cMainValuesTimeIntervals FOREIGN KEY (time_interval_id) REFERENCES time_intervals;
-ALTER TABLE main_values ADD CONSTRAINT cMainValuesGauges FOREIGN KEY (gauge_id) REFERENCES gauges;
-ALTER TABLE main_values ADD CONSTRAINT cMainValuesNamedMainValues FOREIGN KEY (named_value_id) REFERENCES named_main_values;
ALTER TABLE named_main_values ADD CONSTRAINT cNamedMainValuesMainValueTypes FOREIGN KEY (type_id) REFERENCES main_value_types;
-ALTER TABLE ranges ADD CONSTRAINT cRangesRivers FOREIGN KEY (river_id) REFERENCES rivers;
ALTER TABLE rivers ADD CONSTRAINT cRiversUnits FOREIGN KEY (wst_unit_id) REFERENCES units;
-ALTER TABLE wst_column_q_ranges ADD CONSTRAINT cWstColumnQRangesWstColums FOREIGN KEY (wst_column_id) REFERENCES wst_columns;
-ALTER TABLE wst_column_q_ranges ADD CONSTRAINT cWstColumnQRangesWstQRanges FOREIGN KEY (wst_q_range_id) REFERENCES wst_q_ranges;
-ALTER TABLE wst_column_values ADD CONSTRAINT cWstColumnValuesWstColumns FOREIGN KEY (wst_column_id) REFERENCES wst_columns;
ALTER TABLE wst_columns ADD CONSTRAINT cWstColumnsTime_intervals FOREIGN KEY (time_interval_id) REFERENCES time_intervals;
-ALTER TABLE wst_columns ADD CONSTRAINT cWstColumnsWsts FOREIGN KEY (wst_id) REFERENCES wsts;
-ALTER TABLE wst_q_ranges ADD CONSTRAINT cWstQRangesRanges FOREIGN KEY (range_id) REFERENCES RANGES;
-ALTER TABLE wsts ADD CONSTRAINT cWstsRivers FOREIGN KEY (river_id) REFERENCES rivers;
+
+-- Cascading references
+ALTER TABLE annotations ADD CONSTRAINT cAnnotationsRanges FOREIGN KEY (range_id) REFERENCES ranges ON DELETE CASCADE;
+ALTER TABLE cross_section_lines ADD CONSTRAINT cQPSLinesCrossSections FOREIGN KEY (cross_section_id) REFERENCES cross_sections ON DELETE CASCADE;
+ALTER TABLE cross_section_points ADD CONSTRAINT cQPSPointsCrossSectionLines FOREIGN KEY (cross_section_line_id) REFERENCES cross_section_lines ON DELETE CASCADE;
+ALTER TABLE cross_sections ADD CONSTRAINT cCrossSectionsRivers FOREIGN KEY (river_id) REFERENCES rivers ON DELETE CASCADE;
+ALTER TABLE discharge_tables ADD CONSTRAINT cDischargeTablesGauges FOREIGN KEY (gauge_id) REFERENCES gauges ON DELETE CASCADE;
+ALTER TABLE discharge_table_values ADD CONSTRAINT cTableValuesDischargeTables FOREIGN KEY (table_id) REFERENCES discharge_tables ON DELETE CASCADE;
+ALTER TABLE gauges ADD CONSTRAINT cGaugesRanges FOREIGN KEY (range_id) REFERENCES ranges ON DELETE CASCADE;
+ALTER TABLE gauges ADD CONSTRAINT cGaugesRivers FOREIGN KEY (river_id) REFERENCES rivers ON DELETE CASCADE;
+ALTER TABLE hyk_entries ADD CONSTRAINT cHykEntriesHyks FOREIGN KEY (hyk_id) REFERENCES hyks ON DELETE CASCADE;
+ALTER TABLE hyk_flow_zones ADD CONSTRAINT cHykFlowZonesHykFormations FOREIGN KEY (formation_id) REFERENCES hyk_formations ON DELETE CASCADE;
+ALTER TABLE hyk_formations ADD CONSTRAINT cHykFormationsHykEntries FOREIGN KEY (hyk_entry_id) REFERENCES hyk_entries ON DELETE CASCADE;
+ALTER TABLE hyks ADD CONSTRAINT cHyksRivers FOREIGN KEY (river_id) REFERENCES rivers ON DELETE CASCADE;
+ALTER TABLE main_values ADD CONSTRAINT cMainValuesGauges FOREIGN KEY (gauge_id) REFERENCES gauges ON DELETE CASCADE;
+ALTER TABLE ranges ADD CONSTRAINT cRangesRivers FOREIGN KEY (river_id) REFERENCES rivers ON DELETE CASCADE;
+ALTER TABLE wst_column_q_ranges ADD CONSTRAINT cWstColumnQRangesWstColums FOREIGN KEY (wst_column_id) REFERENCES wst_columns ON DELETE CASCADE;
+ALTER TABLE wst_column_q_ranges ADD CONSTRAINT cWstColumnQRangesWstQRanges FOREIGN KEY (wst_q_range_id) REFERENCES wst_q_ranges ON DELETE CASCADE;
+ALTER TABLE wst_columns ADD CONSTRAINT cWstColumnsWsts FOREIGN KEY (wst_id) REFERENCES wsts ON DELETE CASCADE;
+ALTER TABLE wst_column_values ADD CONSTRAINT cWstColumnValuesWstColumns FOREIGN KEY (wst_column_id) REFERENCES wst_columns ON DELETE CASCADE;
+ALTER TABLE wst_q_ranges ADD CONSTRAINT cWstQRangesRanges FOREIGN KEY (range_id) REFERENCES RANGES ON DELETE CASCADE;
+ALTER TABLE wsts ADD CONSTRAINT cWstsRivers FOREIGN KEY (river_id) REFERENCES rivers ON DELETE CASCADE;
+ALTER TABLE wsts ADD CONSTRAINT cWstsWstKinds FOREIGN KEY (kind) REFERENCES wst_kinds;
-- VIEWS
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/doc/schema/postgresql-drop-spatial.sql
--- a/flys-backend/doc/schema/postgresql-drop-spatial.sql Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/doc/schema/postgresql-drop-spatial.sql Fri Mar 22 11:25:54 2013 +0100
@@ -9,9 +9,6 @@
DROP TABLE cross_section_tracks;
DROP SEQUENCE CROSS_SECTION_TRACKS_ID_SEQ;
-DROP TABLE lines;
-DROP SEQUENCE LINES_ID_SEQ;
-
DROP TABLE buildings;
DROP SEQUENCE BUILDINGS_ID_SEQ;
@@ -24,11 +21,11 @@
DROP TABLE dem;
DROP SEQUENCE DEM_ID_SEQ;
-DROP TABLE catchment;
-DROP SEQUENCE CATCHMENT_ID_SEQ;
+DROP TABLE hws_points;
+DROP SEQUENCE HWS_POINTS_ID_SEQ;
-DROP TABLE hws;
-DROP SEQUENCE HWS_ID_SEQ;
+DROP TABLE hws_lines;
+DROP SEQUENCE HWS_LINES_ID_SEQ;
DROP TABLE floodmaps;
DROP SEQUENCE FLOODMAPS_ID_SEQ;
@@ -42,4 +39,11 @@
DROP TABLE gauge_location;
DROP SEQUENCE GAUGE_LOCATION_ID_SEQ;
+DROP TABLE fed_states;
+DROP TABLE hws_kinds;
+DROP TABLE sobek_kinds;
+DROP TABLE sectie_kinds;
+DROP TABLE boundary_kinds;
+DROP TABLE axis_kinds;
+
COMMIT;
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/doc/schema/postgresql-migrate-dami.sql
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-backend/doc/schema/postgresql-migrate-dami.sql Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,127 @@
+DROP TABLE hws;
+DROP SEQUENCE HWS_ID_SEQ;
+DROP TABLE lines;
+DROP SEQUENCE LINES_ID_SEQ;
+DROP TABLE catchment;
+DROP SEQUENCE CATCHMENT_ID_SEQ;
+
+-- Static lookup tables for Hochwasserschutzanlagen
+CREATE TABLE hws_kinds (
+ id int PRIMARY KEY NOT NULL,
+ kind VARCHAR(64) NOT NULL
+);
+INSERT INTO hws_kinds (id, kind) VALUES (1, 'Durchlass');
+INSERT INTO hws_kinds (id, kind) VALUES (2, 'Damm');
+INSERT INTO hws_kinds (id, kind) VALUES (3, 'Graben');
+
+CREATE TABLE fed_states (
+ id int PRIMARY KEY NOT NULL,
+ name VARCHAR(23) NOT NULL
+);
+INSERT INTO fed_states (id, name) VALUES (1, 'Bayern');
+INSERT INTO fed_states (id, name) VALUES (2, 'Hessen');
+INSERT INTO fed_states (id, name) VALUES (3, 'Niedersachsen');
+INSERT INTO fed_states (id, name) VALUES (4, 'Nordrhein-Westfalen');
+INSERT INTO fed_states (id, name) VALUES (5, 'Rheinland-Pfalz');
+INSERT INTO fed_states (id, name) VALUES (6, 'Saarland');
+INSERT INTO fed_states (id, name) VALUES (7, 'Schleswig-Holstein');
+INSERT INTO fed_states (id, name) VALUES (8, 'Brandenburg');
+INSERT INTO fed_states (id, name) VALUES (9, 'Mecklenburg-Vorpommern');
+INSERT INTO fed_states (id, name) VALUES (10, 'Thüringen');
+INSERT INTO fed_states (id, name) VALUES (11, 'Baden-Württemberg');
+INSERT INTO fed_states (id, name) VALUES (12, 'Sachsen-Anhalt');
+INSERT INTO fed_states (id, name) VALUES (13, 'Sachsen');
+INSERT INTO fed_states (id, name) VALUES (14, 'Berlin');
+INSERT INTO fed_states (id, name) VALUES (15, 'Bremen');
+INSERT INTO fed_states (id, name) VALUES (16, 'Hamburg');
+
+CREATE TABLE sectie_kinds (
+ id int PRIMARY KEY NOT NULL,
+ name VARCHAR(64) NOT NULL
+);
+INSERT INTO sectie_kinds (id, name) VALUES (0, 'Unbekannt');
+INSERT INTO sectie_kinds (id, name) VALUES (1, 'Flussschlauch');
+INSERT INTO sectie_kinds (id, name) VALUES (2, 'Uferbank');
+INSERT INTO sectie_kinds (id, name) VALUES (3, 'Überflutungsbereich');
+
+CREATE TABLE sobek_kinds (
+ id int PRIMARY KEY NOT NULL,
+ name VARCHAR(64) NOT NULL
+);
+INSERT INTO sobek_kinds (id, name) VALUES (0, 'Unbekannt');
+INSERT INTO sobek_kinds (id, name) VALUES (1, 'Stromführend');
+INSERT INTO sobek_kinds (id, name) VALUES (2, 'Stromspeichernd');
+
+CREATE TABLE boundary_kinds (
+ id int PRIMARY KEY NOT NULL,
+ name VARCHAR(64) NOT NULL
+);
+INSERT INTO boundary_kinds (id, name) VALUES (0, 'Unbekannt');
+INSERT INTO boundary_kinds (id, name) VALUES (1, 'BfG');
+INSERT INTO boundary_kinds (id, name) VALUES (2, 'Land');
+INSERT INTO boundary_kinds (id, name) VALUES (3, 'Sonstige');
+
+--Hydrologie/HW-Schutzanlagen/*Linien.shp
+CREATE SEQUENCE HWS_LINES_ID_SEQ;
+CREATE TABLE hws_lines (
+ id int PRIMARY KEY NOT NULL,
+ ogr_fid int,
+ kind_id int REFERENCES hws_kinds(id) DEFAULT 2,
+ fed_state_id int REFERENCES fed_states(id),
+ river_id int REFERENCES rivers(id),
+ name VARCHAR(256),
+ path VARCHAR(256),
+ official INT DEFAULT 0,
+ agency VARCHAR(256),
+ range VARCHAR(256),
+ shore_side INT DEFAULT 0,
+ source VARCHAR(256),
+ status_date TIMESTAMP,
+ description VARCHAR(256)
+);
+SELECT AddGeometryColumn('hws_lines', 'geom', 31467, 'LINESTRING', 3);
+-- TODO: clarify whether dike_km_from and dike_km_to are geometries
+
+ALTER TABLE hws_lines ALTER COLUMN id SET DEFAULT NEXTVAL('HWS_LINES_ID_SEQ');
+
+--Hydrologie/HW-Schutzanlagen/*Punkte.shp
+CREATE SEQUENCE HWS_POINTS_ID_SEQ;
+CREATE TABLE hws_points (
+ id int PRIMARY KEY NOT NULL,
+ ogr_fid int,
+ kind_id int REFERENCES hws_kinds(id) DEFAULT 2,
+ fed_state_id int REFERENCES fed_states(id),
+ river_id int REFERENCES rivers(id),
+ name VARCHAR,
+ path VARCHAR,
+ official INT DEFAULT 0,
+ agency VARCHAR,
+ range VARCHAR,
+ shore_side INT DEFAULT 0,
+ source VARCHAR,
+ status_date VARCHAR,
+ description VARCHAR,
+ freeboard FLOAT8,
+ dike_km FLOAT8,
+ z FLOAT8,
+ z_target FLOAT8,
+ rated_level FLOAT8
+);
+SELECT AddGeometryColumn('hws_points', 'geom', 31467, 'POINT', 2);
+
+ALTER TABLE hws_points ALTER COLUMN id SET DEFAULT NEXTVAL('HWS_POINTS_ID_SEQ');
+
+ALTER TABLE hydr_boundaries_poly ADD COLUMN sectie INT REFERENCES sectie_kinds(id);
+ALTER TABLE hydr_boundaries_poly ADD COLUMN sobek INT REFERENCES sobek_kinds(id);
+ALTER TABLE hydr_boundaries_poly ADD FOREIGN KEY (kind) REFERENCES boundary_kinds(id);
+ALTER TABLE hydr_boundaries ADD COLUMN sectie INT REFERENCES sectie_kinds(id);
+ALTER TABLE hydr_boundaries ADD COLUMN sobek INT REFERENCES sobek_kinds(id);
+ALTER TABLE hydr_boundaries ADD FOREIGN KEY (kind) REFERENCES boundary_kinds(id);
+ALTER TABLE dem ADD COLUMN srid INT NOT NULL;
+ALTER TABLE dem ALTER COLUMN year_from DROP NOT NULL;
+ALTER TABLE dem ALTER COLUMN year_to DROP NOT NULL;
+ALTER TABLE dem ALTER COLUMN projection DROP NOT NULL;
+ALTER TABLE dem ALTER COLUMN path SET NOT NULL;
+
+COMMIT;
+
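As an aside to the patch: the ON DELETE CASCADE clauses added throughout this migration mean that deleting a river row automatically removes its dependent rows. A minimal sketch of that behavior, using SQLite stand-ins for two of the tables (table shapes simplified; 'Mosel' and 'Damm A' are made-up sample data, not from the patch):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# SQLite only enforces foreign keys when this pragma is set per connection
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("CREATE TABLE rivers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""
    CREATE TABLE hws_lines (
        id INTEGER PRIMARY KEY,
        river_id INTEGER REFERENCES rivers(id) ON DELETE CASCADE,
        name TEXT
    )""")
conn.execute("INSERT INTO rivers VALUES (1, 'Mosel')")
conn.execute("INSERT INTO hws_lines VALUES (1, 1, 'Damm A')")

# Deleting the river removes its dependent HWS rows automatically,
# which is what allows dropping and re-importing a whole river.
conn.execute("DELETE FROM rivers WHERE id = 1")
remaining = conn.execute("SELECT COUNT(*) FROM hws_lines").fetchone()[0]
print(remaining)  # 0
```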
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/doc/schema/postgresql-minfo.sql
--- a/flys-backend/doc/schema/postgresql-minfo.sql Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/doc/schema/postgresql-minfo.sql Fri Mar 22 11:25:54 2013 +0100
@@ -20,15 +20,19 @@
CONSTRAINT fk_unit FOREIGN KEY (unit_id) REFERENCES units(id)
);
-CREATE SEQUENCE BED_HEIGHT_TYPE_SEQ;
+-- lookup table for bedheight types
CREATE TABLE bed_height_type (
id int NOT NULL,
- name VARCHAR(16) NOT NULL,
- description VARCHAR(255),
+ name VARCHAR(64) NOT NULL,
PRIMARY KEY(id)
);
-
+INSERT INTO bed_height_type VALUES (1, 'Querprofile');
+INSERT INTO bed_height_type VALUES (2, 'Flächenpeilung');
+INSERT INTO bed_height_type VALUES (3, 'Flächen- u. Querprofilpeilungen');
+INSERT INTO bed_height_type VALUES (4, 'DGM');
+INSERT INTO bed_height_type VALUES (5, 'TIN');
+INSERT INTO bed_height_type VALUES (6, 'Modell');
CREATE SEQUENCE BED_HEIGHT_SINGLE_ID_SEQ;
@@ -46,12 +50,12 @@
evaluation_by VARCHAR(255),
description VARCHAR(255),
PRIMARY KEY(id),
- CONSTRAINT fk_bed_single_river_id FOREIGN KEY (river_id) REFERENCES rivers(id),
+ CONSTRAINT fk_bed_single_river_id FOREIGN KEY (river_id) REFERENCES rivers(id) ON DELETE CASCADE,
CONSTRAINT fk_type FOREIGN KEY (type_id) REFERENCES bed_height_type(id),
CONSTRAINT fk_location_system FOREIGN KEY (location_system_id) REFERENCES location_system(id),
CONSTRAINT fk_cur_elevation_model FOREIGN KEY (cur_elevation_model_id) REFERENCES elevation_model(id),
CONSTRAINT fk_old_elevation_model FOREIGN KEY (old_elevation_model_id) REFERENCES elevation_model(id),
- CONSTRAINT fk_range FOREIGN KEY (range_id) REFERENCES ranges(id)
+ CONSTRAINT fk_range FOREIGN KEY (range_id) REFERENCES ranges(id) ON DELETE CASCADE
);
@@ -72,7 +76,7 @@
CONSTRAINT fk_time_interval FOREIGN KEY (time_interval_id) REFERENCES time_intervals(id),
CONSTRAINT fk_epoch_cur_elevation_model FOREIGN KEY (cur_elevation_model_id) REFERENCES elevation_model(id),
CONSTRAINT fk_epoch_old_elevation_model FOREIGN KEY (old_elevation_model_id) REFERENCES elevation_model(id),
- CONSTRAINT fk_epoch_range FOREIGN KEY (range_id) REFERENCES ranges(id)
+ CONSTRAINT fk_epoch_range FOREIGN KEY (range_id) REFERENCES ranges(id) ON DELETE CASCADE
);
@@ -88,7 +92,7 @@
sounding_width NUMERIC,
width NUMERIC,
PRIMARY KEY(id),
- CONSTRAINT fk_bed_single_values_parent FOREIGN KEY (bed_height_single_id) REFERENCES bed_height_single(id)
+ CONSTRAINT fk_bed_single_values_parent FOREIGN KEY (bed_height_single_id) REFERENCES bed_height_single(id) ON DELETE CASCADE
);
@@ -100,7 +104,7 @@
station NUMERIC NOT NULL,
height NUMERIC,
PRIMARY KEY(id),
- CONSTRAINT fk_bed_epoch_values_parent FOREIGN KEY (bed_height_epoch_id) REFERENCES bed_height_epoch(id)
+ CONSTRAINT fk_bed_epoch_values_parent FOREIGN KEY (bed_height_epoch_id) REFERENCES bed_height_epoch(id) ON DELETE CASCADE
);
@@ -125,7 +129,7 @@
unit_id int NOT NULL,
description VARCHAR(256),
PRIMARY KEY(id),
- CONSTRAINT fk_sd_river_id FOREIGN KEY (river_id) REFERENCES rivers(id),
+ CONSTRAINT fk_sd_river_id FOREIGN KEY (river_id) REFERENCES rivers(id) ON DELETE CASCADE,
CONSTRAINT fk_sd_depth_id FOREIGN KEY (depth_id) REFERENCES depths(id),
CONSTRAINT fk_sd_unit_id FOREIGN KEY (unit_id) REFERENCES units(id)
);
@@ -141,7 +145,7 @@
description VARCHAR(256),
year int,
PRIMARY KEY(id),
- CONSTRAINT fk_sdv_sediment_density_id FOREIGN KEY(sediment_density_id) REFERENCES sediment_density(id)
+ CONSTRAINT fk_sdv_sediment_density_id FOREIGN KEY(sediment_density_id) REFERENCES sediment_density(id) ON DELETE CASCADE
);
@@ -152,7 +156,7 @@
river_id int NOT NULL,
unit_id int NOT NULL,
PRIMARY KEY(id),
- CONSTRAINT fk_mw_river_id FOREIGN KEY(river_id) REFERENCES rivers(id),
+ CONSTRAINT fk_mw_river_id FOREIGN KEY(river_id) REFERENCES rivers(id) ON DELETE CASCADE,
CONSTRAINT fk_mw_unit_id FOREIGN KEY(unit_id) REFERENCES units(id)
);
@@ -166,7 +170,7 @@
width NUMERIC NOT NULL,
description VARCHAR(256),
PRIMARY KEY(id),
- CONSTRAINT fk_mwv_morphologic_width_id FOREIGN KEY (morphologic_width_id) REFERENCES morphologic_width(id)
+ CONSTRAINT fk_mwv_morphologic_width_id FOREIGN KEY (morphologic_width_id) REFERENCES morphologic_width(id) ON DELETE CASCADE
);
@@ -180,7 +184,7 @@
lower_discharge VARCHAR(16) NOT NULL,
upper_discharge VARCHAR(16),
PRIMARY KEY(id),
- CONSTRAINT fk_dz_river_id FOREIGN KEY (river_id) REFERENCES rivers(id)
+ CONSTRAINT fk_dz_river_id FOREIGN KEY (river_id) REFERENCES rivers(id) ON DELETE CASCADE
);
@@ -188,12 +192,10 @@
CREATE TABLE flow_velocity_model (
id int NOT NULL,
- river_id int NOT NULL,
discharge_zone_id int NOT NULL,
description VARCHAR(256),
PRIMARY KEY (id),
- CONSTRAINT fk_fvm_river_id FOREIGN KEY (river_id) REFERENCES rivers(id),
- CONSTRAINT fk_fvm_discharge_zone_id FOREIGN KEY (discharge_zone_id) REFERENCES discharge_zone (id)
+ CONSTRAINT fk_fvm_discharge_zone_id FOREIGN KEY (discharge_zone_id) REFERENCES discharge_zone (id) ON DELETE CASCADE
);
@@ -208,7 +210,7 @@
main_channel NUMERIC NOT NULL,
shear_stress NUMERIC NOT NULL,
PRIMARY KEY(id),
- CONSTRAINT fk_fvv_flow_velocity_model_id FOREIGN KEY (flow_velocity_model_id) REFERENCES flow_velocity_model(id)
+ CONSTRAINT fk_fvv_flow_velocity_model_id FOREIGN KEY (flow_velocity_model_id) REFERENCES flow_velocity_model(id) ON DELETE CASCADE
);
@@ -220,7 +222,7 @@
river_id int NOT NULL,
description VARCHAR(256),
PRIMARY KEY (id),
- CONSTRAINT fk_fvm_rivers_id FOREIGN KEY (river_id) REFERENCES rivers(id)
+ CONSTRAINT fk_fvm_rivers_id FOREIGN KEY (river_id) REFERENCES rivers(id) ON DELETE CASCADE
);
CREATE SEQUENCE FV_MEASURE_VALUES_ID_SEQ;
@@ -235,7 +237,7 @@
v NUMERIC NOT NULL,
description VARCHAR(256),
PRIMARY KEY (id),
- CONSTRAINT fk_fvmv_measurements_id FOREIGN KEY (measurements_id) REFERENCES flow_velocity_measurements (id)
+ CONSTRAINT fk_fvmv_measurements_id FOREIGN KEY (measurements_id) REFERENCES flow_velocity_measurements (id) ON DELETE CASCADE
);
@@ -246,9 +248,7 @@
name VARCHAR(64) NOT NULL,
lower NUMERIC,
upper NUMERIC,
- unit_id int,
- PRIMARY KEY (id),
- CONSTRAINT fk_gf_unit_id FOREIGN KEY (unit_id) REFERENCES units(id)
+ PRIMARY KEY (id)
);
@@ -262,7 +262,7 @@
time_interval_id int NOT NULL,
description VARCHAR(256),
PRIMARY KEY (id),
- CONSTRAINT fk_sy_river_id FOREIGN KEY (river_id) REFERENCES rivers(id),
+ CONSTRAINT fk_sy_river_id FOREIGN KEY (river_id) REFERENCES rivers(id) ON DELETE CASCADE,
CONSTRAINT fk_sy_grain_fraction_id FOREIGN KEY (grain_fraction_id) REFERENCES grain_fraction(id),
CONSTRAINT fk_sy_unit_id FOREIGN KEY (unit_id) REFERENCES units(id),
CONSTRAINT fk_sy_time_interval_id FOREIGN KEY (time_interval_id) REFERENCES time_intervals(id)
@@ -277,101 +277,29 @@
station NUMERIC NOT NULL,
value NUMERIC NOT NULL,
PRIMARY KEY (id),
- CONSTRAINT fk_syv_sediment_yield_id FOREIGN KEY (sediment_yield_id) REFERENCES sediment_yield(id)
-);
-
-
-CREATE SEQUENCE WATERLEVEL_ID_SEQ;
-
-CREATE TABLE waterlevel (
- id int NOT NULL,
- river_id int NOT NULL,
- unit_id int NOT NULL,
- description VARCHAR(256),
- PRIMARY KEY (id),
- CONSTRAINT fk_w_river_id FOREIGN KEY (river_id) REFERENCES rivers(id),
- CONSTRAINT fk_w_unit_id FOREIGN KEY (unit_id) REFERENCES units(id)
-);
-
-
-CREATE SEQUENCE WATERLEVEL_Q_RANGES_ID_SEQ;
-
-CREATE TABLE waterlevel_q_range (
- id int NOT NULL,
- waterlevel_id int NOT NULL,
- q NUMERIC NOT NULL,
- PRIMARY KEY (id),
- CONSTRAINT fk_wqr_waterlevel_id FOREIGN KEY (waterlevel_id) REFERENCES waterlevel(id)
-);
-
-
-CREATE SEQUENCE WATERLEVEL_VALUES_ID_SEQ;
-
-CREATE TABLE waterlevel_values (
- id int NOT NULL,
- waterlevel_q_range_id int NOT NULL,
- station NUMERIC NOT NULL,
- w NUMERIC NOT NULL,
- PRIMARY KEY (id),
- CONSTRAINT fk_wv_waterlevel_q_range_id FOREIGN KEY (waterlevel_q_range_id) REFERENCES waterlevel_q_range(id)
-);
-
-
-CREATE SEQUENCE WATERLEVEL_DIFFERENCE_ID_SEQ;
-
-CREATE TABLE waterlevel_difference (
- id int NOT NULL,
- river_id int NOT NULL,
- unit_id int NOT NULL,
- description VARCHAR(256),
- PRIMARY KEY (id),
- CONSTRAINT fk_wd_river_id FOREIGN KEY (river_id) REFERENCES rivers (id),
- CONSTRAINT fk_wd_unit_id FOREIGN KEY (unit_id) REFERENCES units(id)
-);
-
-
-CREATE SEQUENCE WATERLEVEL_DIFF_COLUMN_ID_SEQ;
-
-CREATE TABLE waterlevel_difference_column (
- id int NOT NULL,
- difference_id int NOT NULL,
- description VARCHAR(256),
- PRIMARY KEY (id),
- CONSTRAINT fk_wdc_difference_id FOREIGN KEY (difference_id) REFERENCES waterlevel_difference (id)
-);
-
-
-CREATE SEQUENCE WATERLEVEL_DIFF_VALUES_ID_SEQ;
-
-CREATE TABLE waterlevel_difference_values (
- id int NOT NULL,
- column_id int NOT NULL,
- station NUMERIC NOT NULL,
- value NUMERIC NOT NULL,
- PRIMARY KEY (id),
- CONSTRAINT fk_wdv_column_id FOREIGN KEY (column_id) REFERENCES waterlevel_difference_column (id)
+ CONSTRAINT fk_syv_sediment_yield_id FOREIGN KEY (sediment_yield_id) REFERENCES sediment_yield(id) ON DELETE CASCADE
);
CREATE SEQUENCE MEASUREMENT_STATION_ID_SEQ;
CREATE TABLE measurement_station (
- id int NOT NULL,
- name VARCHAR(256) NOT NULL,
- river_id int NOT NULL,
- station NUMERIC NOT NULL,
- range_id int NOT NULL,
- measurement_type VARCHAR(64) NOT NULL,
- riverside VARCHAR(16),
- reference_gauge_id int,
- observation_timerange_id int,
- operator VARCHAR(64),
- comment VARCHAR(512),
- PRIMARY KEY (id),
- CONSTRAINT fk_ms_river_id FOREIGN KEY (river_id) REFERENCES rivers(id),
- CONSTRAINT fk_ms_range_id FOREIGN KEY (range_id) REFERENCES ranges(id),
- CONSTRAINT fk_ms_reference_gauge_id FOREIGN KEY (reference_gauge_id) REFERENCES gauges(id),
- CONSTRAINT fk_ms_observation_timerange_id FOREIGN KEY (observation_timerange_id) REFERENCES time_intervals(id),
- UNIQUE (river_id, station)
+ id int NOT NULL,
+ name VARCHAR(256) NOT NULL,
+ river_id int NOT NULL,
+ station NUMERIC NOT NULL,
+ range_id int NOT NULL,
+ measurement_type VARCHAR(64) NOT NULL,
+ riverside VARCHAR(16),
+ reference_gauge_id int,
+ observation_timerange_id int,
+ operator VARCHAR(64),
+ description VARCHAR(512),
+ PRIMARY KEY (id),
+ CONSTRAINT fk_ms_river_id FOREIGN KEY (river_id) REFERENCES rivers(id) ON DELETE CASCADE,
+ CONSTRAINT fk_ms_range_id FOREIGN KEY (range_id) REFERENCES ranges(id) ON DELETE CASCADE,
+ CONSTRAINT fk_ms_reference_gauge_id FOREIGN KEY (reference_gauge_id) REFERENCES gauges(id) ON DELETE CASCADE,
+ CONSTRAINT fk_ms_observation_timerange_id FOREIGN KEY (observation_timerange_id) REFERENCES time_intervals(id),
+ UNIQUE (river_id, station)
);
@@ -383,7 +311,7 @@
time_interval_id int NOT NULL,
description VARCHAR(256),
PRIMARY KEY (id),
- CONSTRAINT fk_sqr_river_id FOREIGN KEY (river_id) REFERENCES rivers(id),
+ CONSTRAINT fk_sqr_river_id FOREIGN KEY (river_id) REFERENCES rivers(id) ON DELETE CASCADE,
CONSTRAINT fk_sqr_tinterval_id FOREIGN KEY (time_interval_id) REFERENCES time_intervals(id)
);
@@ -400,6 +328,6 @@
a NUMERIC NOT NULL,
b NUMERIC NOT NULL,
PRIMARY KEY (id),
- CONSTRAINT fk_sqr_id FOREIGN KEY (sq_relation_id) REFERENCES sq_relation(id)
+ CONSTRAINT fk_sqr_id FOREIGN KEY (sq_relation_id) REFERENCES sq_relation(id) ON DELETE CASCADE
);
COMMIT;
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/doc/schema/postgresql-setup.sh
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-backend/doc/schema/postgresql-setup.sh Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,27 @@
+#!/bin/bash
+
+# $1: user name and password for new DB (equals DB name)
+# $2: path to directory with schema-scripts
+# $3: host
+
+# Run as user postgres (the PostgreSQL superuser).
+# It is assumed that the owner of the DB has the same name as the DB!
+
+# create PostGIS-DB
+createuser -S -D -R "$1"
+createdb "$1"
+createlang plpgsql "$1"
+# Appears e.g. as /usr/share/postgresql/contrib/postgis-1.5/ on other systems.
+psql -d "$1" -f /usr/share/postgresql/8.4/contrib/postgis-1.5/postgis.sql
+psql -d "$1" -f /usr/share/postgresql/8.4/contrib/postgis-1.5/spatial_ref_sys.sql
+psql -d "$1" -c "ALTER USER $1 WITH PASSWORD '$1';"
+psql -d "$1" -c "GRANT ALL ON geometry_columns TO $1; GRANT ALL ON geography_columns TO $1; GRANT ALL ON spatial_ref_sys TO $1;"
+
+# add credentials to .pgpass (or create .pgpass)
+echo "*:*:$1:$1:$1" >> ~/.pgpass
+chmod 0600 ~/.pgpass
+
+# apply schema-scripts
+psql -d "$1" -U "$1" -h "$3" -f "$2/postgresql.sql"
+psql -d "$1" -U "$1" -h "$3" -f "$2/postgresql-spatial.sql"
+psql -d "$1" -U "$1" -h "$3" -f "$2/postgresql-minfo.sql"
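A note on the setup script above: it takes the DB name, the schema directory, and the host as positional parameters, and appends a host:port:database:user:password line to ~/.pgpass with database, user, and password all equal to $1. A small sketch that mirrors that entry format so it can be checked (the invocation shown in the comment and the name "flys" are hypothetical examples, not from the patch):

```python
# Hypothetical invocation, run as the postgres system user:
#   ./postgresql-setup.sh flys /path/to/flys-backend/doc/schema localhost
def pgpass_entry(name: str) -> str:
    # '*' wildcards match any host and port, as in the script's echo line;
    # the script reuses the same value for database, user, and password.
    return ":".join(["*", "*", name, name, name])

print(pgpass_entry("flys"))  # *:*:flys:flys:flys
```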
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/doc/schema/postgresql-spatial.sql
--- a/flys-backend/doc/schema/postgresql-spatial.sql Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/doc/schema/postgresql-spatial.sql Fri Mar 22 11:25:54 2013 +0100
@@ -1,15 +1,23 @@
BEGIN;
+CREATE TABLE axis_kinds(
+ id int PRIMARY KEY NOT NULL,
+ name VARCHAR(64)
+);
+INSERT INTO axis_kinds(id, name) VALUES (0, 'Unbekannt');
+INSERT INTO axis_kinds(id, name) VALUES (1, 'Aktuell');
+INSERT INTO axis_kinds(id, name) VALUES (2, 'Sonstige');
+
-- Geodaesie/Flussachse+km/achse
CREATE SEQUENCE RIVER_AXES_ID_SEQ;
CREATE TABLE river_axes (
id int PRIMARY KEY NOT NULL,
- river_id int REFERENCES rivers(id),
- kind int NOT NULL DEFAULT 0,
+ river_id int REFERENCES rivers(id) ON DELETE CASCADE,
+ kind_id int REFERENCES axis_kinds(id) NOT NULL DEFAULT 0,
name VARCHAR(64),
path VARCHAR(256)
);
-SELECT AddGeometryColumn('river_axes', 'geom', 31467, 'LINESTRING', 2);
+SELECT AddGeometryColumn('river_axes', 'geom', 31467, 'MULTILINESTRING', 2);
ALTER TABLE river_axes ALTER COLUMN id SET DEFAULT NEXTVAL('RIVER_AXES_ID_SEQ');
@@ -18,8 +26,8 @@
CREATE SEQUENCE RIVER_AXES_KM_ID_SEQ;
CREATE TABLE river_axes_km (
id int PRIMARY KEY NOT NULL,
- river_id int REFERENCES rivers(id),
- km NUMERIC NOT NULL,
+ river_id int REFERENCES rivers(id) ON DELETE CASCADE,
+ km FLOAT8 NOT NULL,
name VARCHAR(64),
path VARCHAR(256)
);
@@ -28,12 +36,20 @@
--Geodaesie/Querprofile/QP-Spuren/qps.shp
+CREATE TABLE cross_section_track_kinds(
+ id int PRIMARY KEY NOT NULL,
+ name VARCHAR(64)
+);
+INSERT INTO cross_section_track_kinds(id, name) VALUES (0, 'Sonstige');
+INSERT INTO cross_section_track_kinds(id, name) VALUES (1, 'Aktuell');
+
CREATE SEQUENCE CROSS_SECTION_TRACKS_ID_SEQ;
CREATE TABLE cross_section_tracks (
id int PRIMARY KEY NOT NULL,
- river_id int REFERENCES rivers(id),
- km NUMERIC NOT NULL,
- z NUMERIC NOT NULL DEFAULT 0,
+ river_id int REFERENCES rivers(id) ON DELETE CASCADE,
+ kind_id int REFERENCES cross_section_track_kinds(id) NOT NULL DEFAULT 0,
+ km FLOAT8 NOT NULL,
+ z FLOAT8 NOT NULL DEFAULT 0,
name VARCHAR(64),
path VARCHAR(256)
);
@@ -41,28 +57,11 @@
ALTER TABLE cross_section_tracks ALTER COLUMN id SET DEFAULT NEXTVAL('CROSS_SECTION_TRACKS_ID_SEQ');
--- Geodaesie/Linien/rohre-und-spreen
-CREATE SEQUENCE LINES_ID_SEQ;
-CREATE TABLE lines (
- id int PRIMARY KEY NOT NULL,
- river_id int REFERENCES rivers(id),
- kind VARCHAR(16) NOT NULL,
- z NUMERIC DEFAULT 0,
- name VARCHAR(64),
- path VARCHAR(256)
-);
-SELECT AddGeometryColumn('lines', 'geom', 31467, 'LINESTRING', 3);
-ALTER TABLE lines ALTER COLUMN id SET DEFAULT NEXTVAL('LINES_ID_SEQ');
--- 'kind':
--- 0: ROHR1
--- 1: DAMM
-
-
-- Geodaesie/Bauwerke/Wehre.shp
CREATE SEQUENCE BUILDINGS_ID_SEQ;
CREATE TABLE buildings (
id int PRIMARY KEY NOT NULL,
- river_id int REFERENCES rivers(id),
+ river_id int REFERENCES rivers(id) ON DELETE CASCADE,
name VARCHAR(256),
path VARCHAR(256)
);
@@ -74,10 +73,10 @@
CREATE SEQUENCE FIXPOINTS_ID_SEQ;
CREATE TABLE fixpoints (
id int PRIMARY KEY NOT NULL,
- river_id int REFERENCES rivers(id),
- x int,
- y int,
- km NUMERIC NOT NULL,
+ river_id int REFERENCES rivers(id) ON DELETE CASCADE,
+ x FLOAT8,
+ y FLOAT8,
+ km FLOAT8 NOT NULL,
HPGP VARCHAR(2),
name VARCHAR(64),
path VARCHAR(256)
@@ -87,10 +86,18 @@
-- Hydrologie/Hydr. Grenzen/talaue.shp
+CREATE TABLE floodplain_kinds(
+ id int PRIMARY KEY NOT NULL,
+ name VARCHAR(64)
+);
+INSERT INTO floodplain_kinds(id, name) VALUES (0, 'Sonstige');
+INSERT INTO floodplain_kinds(id, name) VALUES (1, 'Aktuell');
+
CREATE SEQUENCE FLOODPLAIN_ID_SEQ;
CREATE TABLE floodplain (
id int PRIMARY KEY NOT NULL,
- river_id int REFERENCES rivers(id),
+ river_id int REFERENCES rivers(id) ON DELETE CASCADE,
+ kind_id int REFERENCES floodplain_kinds(id) NOT NULL DEFAULT 0,
name VARCHAR(64),
path VARCHAR(256)
);
@@ -102,111 +109,217 @@
CREATE SEQUENCE DEM_ID_SEQ;
CREATE TABLE dem (
id int PRIMARY KEY NOT NULL,
- river_id int REFERENCES rivers(id),
+ river_id int REFERENCES rivers(id) ON DELETE CASCADE,
-- XXX Should we use the ranges table instead?
- name VARCHAR(64),
- lower NUMERIC,
- upper NUMERIC,
- year_from VARCHAR(32) NOT NULL,
- year_to VARCHAR(32) NOT NULL,
- projection VARCHAR(32) NOT NULL,
- elevation_state VARCHAR(32),
- format VARCHAR(32),
- border_break BOOLEAN NOT NULL DEFAULT FALSE,
- resolution VARCHAR(16),
- description VARCHAR(256),
- path VARCHAR(256)
+ name VARCHAR(64),
+ range_id INT REFERENCES ranges(id),
+ time_interval_id INT REFERENCES time_intervals(id),
+ projection VARCHAR(32),
+ srid int NOT NULL,
+ elevation_state VARCHAR(32),
+ format VARCHAR(32),
+ border_break BOOLEAN NOT NULL DEFAULT FALSE,
+ resolution VARCHAR(16),
+ description VARCHAR(256),
+ path VARCHAR(256) NOT NULL
);
ALTER TABLE dem ALTER COLUMN id SET DEFAULT NEXTVAL('DEM_ID_SEQ');
--- Hydrologie/Einzugsgebiete/EZG.shp
-CREATE SEQUENCE CATCHMENT_ID_SEQ;
-CREATE TABLE catchment (
+-- Static lookup tables for Hochwasserschutzanlagen
+CREATE TABLE hws_kinds (
id int PRIMARY KEY NOT NULL,
- river_id int REFERENCES rivers(id),
- area NUMERIC,
+ kind VARCHAR(64) NOT NULL
+);
+INSERT INTO hws_kinds (id, kind) VALUES (1, 'Durchlass');
+INSERT INTO hws_kinds (id, kind) VALUES (2, 'Damm');
+INSERT INTO hws_kinds (id, kind) VALUES (3, 'Graben');
+
+CREATE TABLE fed_states (
+ id int PRIMARY KEY NOT NULL,
+ name VARCHAR(23) NOT NULL
+);
+INSERT INTO fed_states (id, name) VALUES (1, 'Bayern');
+INSERT INTO fed_states (id, name) VALUES (2, 'Hessen');
+INSERT INTO fed_states (id, name) VALUES (3, 'Niedersachsen');
+INSERT INTO fed_states (id, name) VALUES (4, 'Nordrhein-Westfalen');
+INSERT INTO fed_states (id, name) VALUES (5, 'Rheinland-Pfalz');
+INSERT INTO fed_states (id, name) VALUES (6, 'Saarland');
+INSERT INTO fed_states (id, name) VALUES (7, 'Schleswig-Holstein');
+INSERT INTO fed_states (id, name) VALUES (8, 'Brandenburg');
+INSERT INTO fed_states (id, name) VALUES (9, 'Mecklenburg-Vorpommern');
+INSERT INTO fed_states (id, name) VALUES (10, 'Thüringen');
+INSERT INTO fed_states (id, name) VALUES (11, 'Baden-Württemberg');
+INSERT INTO fed_states (id, name) VALUES (12, 'Sachsen-Anhalt');
+INSERT INTO fed_states (id, name) VALUES (13, 'Sachsen');
+INSERT INTO fed_states (id, name) VALUES (14, 'Berlin');
+INSERT INTO fed_states (id, name) VALUES (15, 'Bremen');
+INSERT INTO fed_states (id, name) VALUES (16, 'Hamburg');
+
+--Hydrologie/HW-Schutzanlagen/*Linien.shp
+CREATE SEQUENCE HWS_LINES_ID_SEQ;
+CREATE TABLE hws_lines (
+ id int PRIMARY KEY NOT NULL,
+ ogr_fid int,
+ kind_id int REFERENCES hws_kinds(id) DEFAULT 2,
+ fed_state_id int REFERENCES fed_states(id),
+ river_id int REFERENCES rivers(id) ON DELETE CASCADE,
name VARCHAR(256),
- path VARCHAR(256)
+ path VARCHAR(256),
+ official INT DEFAULT 0,
+ agency VARCHAR(256),
+ range VARCHAR(256),
+ shore_side INT DEFAULT 0,
+ source VARCHAR(256),
+ status_date TIMESTAMP,
+ description VARCHAR(256)
);
-SELECT AddGeometryColumn('catchment','geom',31467,'POLYGON',2);
-ALTER TABLE catchment ALTER COLUMN id SET DEFAULT NEXTVAL('CATCHMENT_ID_SEQ');
+SELECT AddGeometryColumn('hws_lines', 'geom', 31467, 'MULTILINESTRING', 3);
+-- TODO: clarify whether dike_km_from and dike_km_to are geometries
+ALTER TABLE hws_lines ALTER COLUMN id SET DEFAULT NEXTVAL('HWS_LINES_ID_SEQ');
---Hydrologie/HW-Schutzanlagen/hws.shp
-CREATE SEQUENCE HWS_ID_SEQ;
-CREATE TABLE hws (
+--Hydrologie/HW-Schutzanlagen/*Punkte.shp
+CREATE SEQUENCE HWS_POINTS_ID_SEQ;
+CREATE TABLE hws_points (
id int PRIMARY KEY NOT NULL,
- river_id int REFERENCES rivers(id),
- hws_facility VARCHAR(256),
- type VARCHAR(256),
- name VARCHAR(64),
- path VARCHAR(256)
+ ogr_fid int,
+ kind_id int REFERENCES hws_kinds(id) DEFAULT 2,
+ fed_state_id int REFERENCES fed_states(id),
+ river_id int REFERENCES rivers(id) ON DELETE CASCADE,
+ name VARCHAR,
+ path VARCHAR,
+ official INT DEFAULT 0,
+ agency VARCHAR,
+ range VARCHAR,
+ shore_side INT DEFAULT 0,
+ source VARCHAR,
+ status_date VARCHAR,
+ description VARCHAR,
+ freeboard FLOAT8,
+ dike_km FLOAT8,
+ z FLOAT8,
+ z_target FLOAT8,
+ rated_level FLOAT8
);
-SELECT AddGeometryColumn('hws','geom',31467,'LINESTRING',2);
-ALTER TABLE hws ALTER COLUMN id SET DEFAULT NEXTVAL('HWS_ID_SEQ');
+SELECT AddGeometryColumn('hws_points', 'geom', 31467, 'POINT', 2);
+ALTER TABLE hws_points ALTER COLUMN id SET DEFAULT NEXTVAL('HWS_POINTS_ID_SEQ');
--
--Hydrologie/UeSG
---
--- 'kind' can be one of:
--- 200 = Messung
--- 111 = Berechnung->Aktuell->BfG
--- 112 = Berechnung->Aktuell->Land
--- 121 = Berechnung->Potenziell->BfG
--- 122 = Berechnung->Potenziell->Land
---
+CREATE TABLE floodmap_kinds (
+ id int PRIMARY KEY NOT NULL,
+ name varchar(64) NOT NULL
+);
+INSERT INTO floodmap_kinds VALUES (200, 'Messung');
+INSERT INTO floodmap_kinds VALUES (111, 'Berechnung-Aktuell-BfG');
+INSERT INTO floodmap_kinds VALUES (112, 'Berechnung-Aktuell-Bundesländer');
+INSERT INTO floodmap_kinds VALUES (121, 'Berechnung-Potenziell-BfG');
+INSERT INTO floodmap_kinds VALUES (122, 'Berechnung-Potenziell-Bundesländer');
+
CREATE SEQUENCE FLOODMAPS_ID_SEQ;
CREATE TABLE floodmaps (
id int PRIMARY KEY NOT NULL,
- river_id int REFERENCES rivers(id),
+ river_id int REFERENCES rivers(id) ON DELETE CASCADE,
name varchar(64) NOT NULL,
- kind int NOT NULL,
- diff real,
+ kind int NOT NULL REFERENCES floodmap_kinds(id),
+ diff FLOAT8,
count int,
- area real,
- perimeter real,
- path VARCHAR(256)
+ area FLOAT8,
+ perimeter FLOAT8,
+ path VARCHAR(256),
+ source varchar(64)
);
SELECT AddGeometryColumn('floodmaps', 'geom', 31467, 'MULTIPOLYGON', 2);
ALTER TABLE floodmaps DROP CONSTRAINT enforce_geotype_geom;
ALTER TABLE floodmaps ADD CONSTRAINT enforce_geotype_geom CHECK (geometrytype(geom) = 'POLYGON'::text OR geometrytype(geom) = 'MULTIPOLYGON'::text);
ALTER TABLE floodmaps ALTER COLUMN id SET DEFAULT NEXTVAL('FLOODMAPS_ID_SEQ');
+CREATE TABLE sectie_kinds (
+ id int PRIMARY KEY NOT NULL,
+ name VARCHAR(64) NOT NULL
+);
+INSERT INTO sectie_kinds (id, name) VALUES (0, 'Unbekannt');
+INSERT INTO sectie_kinds (id, name) VALUES (1, 'Flussschlauch');
+INSERT INTO sectie_kinds (id, name) VALUES (2, 'Uferbank');
+INSERT INTO sectie_kinds (id, name) VALUES (3, 'Überflutungsbereich');
+
+CREATE TABLE sobek_kinds (
+ id int PRIMARY KEY NOT NULL,
+ name VARCHAR(64) NOT NULL
+);
+INSERT INTO sobek_kinds (id, name) VALUES (0, 'Unbekannt');
+INSERT INTO sobek_kinds (id, name) VALUES (1, 'Stromführend');
+INSERT INTO sobek_kinds (id, name) VALUES (2, 'Stromspeichernd');
+
+CREATE TABLE boundary_kinds (
+ id int PRIMARY KEY NOT NULL,
+ name VARCHAR(64) NOT NULL
+);
+INSERT INTO boundary_kinds (id, name) VALUES (0, 'Unbekannt');
+INSERT INTO boundary_kinds (id, name) VALUES (1, 'BfG');
+INSERT INTO boundary_kinds (id, name) VALUES (2, 'Land');
+INSERT INTO boundary_kinds (id, name) VALUES (3, 'Sonstige');
CREATE SEQUENCE HYDR_BOUNDARIES_ID_SEQ;
CREATE TABLE hydr_boundaries (
id int PRIMARY KEY NOT NULL,
- river_id int REFERENCES rivers(id),
+ river_id int REFERENCES rivers(id) ON DELETE CASCADE,
name VARCHAR(255),
- kind int,
+ kind int REFERENCES boundary_kinds(id),
+ sectie int REFERENCES sectie_kinds(id),
+ sobek int REFERENCES sobek_kinds(id),
path VARCHAR(256)
);
-SELECT AddGeometryColumn('hydr_boundaries','geom',31467,'LINESTRING',3);
+SELECT AddGeometryColumn('hydr_boundaries','geom',31467,'MULTILINESTRING',3);
ALTER TABLE hydr_boundaries ALTER COLUMN id SET DEFAULT NEXTVAL('HYDR_BOUNDARIES_ID_SEQ');
CREATE SEQUENCE HYDR_BOUNDARIES_POLY_ID_SEQ;
CREATE TABLE hydr_boundaries_poly (
id int PRIMARY KEY NOT NULL,
- river_id int REFERENCES rivers(id),
+ river_id int REFERENCES rivers(id) ON DELETE CASCADE,
name VARCHAR(255),
- kind int,
+ kind int REFERENCES boundary_kinds(id),
+ sectie int REFERENCES sectie_kinds(id),
+ sobek int REFERENCES sobek_kinds(id),
path VARCHAR(256)
);
-SELECT AddGeometryColumn('hydr_boundaries_poly','geom',31467,'POLYGON',3);
+SELECT AddGeometryColumn('hydr_boundaries_poly','geom',31467,'MULTIPOLYGON',3);
ALTER TABLE hydr_boundaries_poly ALTER COLUMN id SET DEFAULT NEXTVAL('HYDR_BOUNDARIES_POLY_ID_SEQ');
CREATE SEQUENCE GAUGE_LOCATION_ID_SEQ;
CREATE TABLE gauge_location (
id int PRIMARY KEY NOT NULL,
- river_id int REFERENCES rivers(id),
+ river_id int REFERENCES rivers(id) ON DELETE CASCADE,
name VARCHAR(255),
path VARCHAR(256)
);
SELECT AddGeometryColumn('gauge_location','geom',31467,'POINT',2);
ALTER TABLE gauge_location ALTER COLUMN id SET DEFAULT NEXTVAL('GAUGE_LOCATION_ID_SEQ');
+
+CREATE TABLE jetty_kinds(
+ id int PRIMARY KEY NOT NULL,
+ name VARCHAR(64)
+);
+INSERT INTO jetty_kinds VALUES (0, 'Buhnenkopf');
+INSERT INTO jetty_kinds VALUES (1, 'Buhnenfuß');
+INSERT INTO jetty_kinds VALUES (2, 'Buhnenwurzel');
+
+CREATE SEQUENCE JETTIES_ID_SEQ;
+CREATE TABLE jetties (
+ id int PRIMARY KEY NOT NULL,
+ river_id int REFERENCES rivers(id) ON DELETE CASCADE,
+ path VARCHAR(256),
+ kind_id int REFERENCES jetty_kinds(id),
+ km FLOAT8,
+ z FLOAT8
+);
+SELECT AddGeometryColumn('jetties','geom',31467,'POINT',2);
+ALTER TABLE jetties ALTER COLUMN id SET DEFAULT NEXTVAL('JETTIES_ID_SEQ');
+
+
COMMIT;
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/doc/schema/postgresql.sql
--- a/flys-backend/doc/schema/postgresql.sql Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/doc/schema/postgresql.sql Fri Mar 22 11:25:54 2013 +0100
@@ -31,12 +31,13 @@
CREATE TABLE ranges (
id int PRIMARY KEY NOT NULL,
- river_id int NOT NULL REFERENCES rivers(id),
+ river_id int NOT NULL REFERENCES rivers(id) ON DELETE CASCADE,
a NUMERIC NOT NULL,
b NUMERIC,
UNIQUE (river_id, a, b)
);
+
-- Lage 'links', 'rechts', etc.
CREATE SEQUENCE POSITIONS_ID_SEQ;
@@ -68,7 +69,7 @@
CREATE TABLE annotations (
id int PRIMARY KEY NOT NULL,
- range_id int NOT NULL REFERENCES ranges(id),
+ range_id int NOT NULL REFERENCES ranges(id) ON DELETE CASCADE,
attribute_id int NOT NULL REFERENCES attributes(id),
position_id int REFERENCES positions(id),
edge_id int REFERENCES edges(id),
@@ -81,15 +82,16 @@
CREATE TABLE gauges (
id int PRIMARY KEY NOT NULL,
name VARCHAR(256) NOT NULL,
- river_id int NOT NULL REFERENCES rivers(id),
- station NUMERIC NOT NULL UNIQUE,
+ -- TODO: river_id could be removed because range_id already references the river
+ river_id int NOT NULL REFERENCES rivers(id) ON DELETE CASCADE,
+ station NUMERIC NOT NULL,
aeo NUMERIC NOT NULL,
- official_number int8 UNIQUE,
+ official_number int8 UNIQUE,
-- Pegelnullpunkt
datum NUMERIC NOT NULL,
-- Streckengueltigkeit
- range_id int REFERENCES ranges (id),
+ range_id int NOT NULL REFERENCES ranges (id) ON DELETE CASCADE,
UNIQUE (name, river_id),
UNIQUE (river_id, station)
@@ -123,12 +125,13 @@
CHECK (start_time <= stop_time)
);
+
-- Stammdaten
CREATE SEQUENCE MAIN_VALUES_ID_SEQ;
CREATE TABLE main_values (
id int PRIMARY KEY NOT NULL,
- gauge_id int NOT NULL REFERENCES gauges(id),
+ gauge_id int NOT NULL REFERENCES gauges(id) ON DELETE CASCADE,
named_value_id int NOT NULL REFERENCES named_main_values(id),
value NUMERIC NOT NULL,
@@ -143,7 +146,7 @@
CREATE TABLE discharge_tables (
id int PRIMARY KEY NOT NULL,
- gauge_id int NOT NULL REFERENCES gauges(id),
+ gauge_id int NOT NULL REFERENCES gauges(id) ON DELETE CASCADE,
description VARCHAR(256) NOT NULL,
bfg_id VARCHAR(50),
kind int NOT NULL DEFAULT 0,
@@ -158,7 +161,7 @@
CREATE TABLE discharge_table_values (
id int PRIMARY KEY NOT NULL,
- table_id int NOT NULL REFERENCES discharge_tables(id),
+ table_id int NOT NULL REFERENCES discharge_tables(id) ON DELETE CASCADE,
q NUMERIC NOT NULL,
w NUMERIC NOT NULL,
@@ -166,13 +169,28 @@
);
-- WST files
+-- Lookup table for wst kinds
+CREATE TABLE wst_kinds (
+ id int PRIMARY KEY NOT NULL,
+ kind VARCHAR(64) NOT NULL
+);
+INSERT INTO wst_kinds (id, kind) VALUES (0, 'basedata');
+INSERT INTO wst_kinds (id, kind) VALUES (1, 'basedata_additionals_marks');
+INSERT INTO wst_kinds (id, kind) VALUES (2, 'basedata_fixations_wst');
+INSERT INTO wst_kinds (id, kind) VALUES (3, 'basedata_officials');
+INSERT INTO wst_kinds (id, kind) VALUES (4, 'basedata_heightmarks-points-relative_points');
+INSERT INTO wst_kinds (id, kind) VALUES (5, 'basedata_flood-protections_relative_points');
+INSERT INTO wst_kinds (id, kind) VALUES (6, 'morpho_waterlevel-differences');
+INSERT INTO wst_kinds (id, kind) VALUES (7, 'morpho_waterlevels');
+
+
CREATE SEQUENCE WSTS_ID_SEQ;
CREATE TABLE wsts (
id int PRIMARY KEY NOT NULL,
- river_id int NOT NULL REFERENCES rivers(id),
+ river_id int NOT NULL REFERENCES rivers(id) ON DELETE CASCADE,
description VARCHAR(256) NOT NULL,
- kind int NOT NULL DEFAULT 0,
+ kind int NOT NULL REFERENCES wst_kinds(id) DEFAULT 0,
-- TODO: more meta infos
UNIQUE (river_id, description)
);
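With this hunk, `wsts.kind` turns from a free-form int into a foreign key into the new `wst_kinds` lookup table, so only registered kinds can be stored. The effect can be sketched as follows (illustrative only: SQLite instead of PostgreSQL, abridged kind list):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only with this pragma
conn.execute("CREATE TABLE wst_kinds (id INTEGER PRIMARY KEY, kind TEXT NOT NULL)")
conn.executemany(
    "INSERT INTO wst_kinds VALUES (?, ?)",
    [(0, "basedata"), (7, "morpho_waterlevels")])  # abridged kind list
conn.execute("""CREATE TABLE wsts (
    id INTEGER PRIMARY KEY,
    description TEXT NOT NULL,
    kind INTEGER NOT NULL DEFAULT 0 REFERENCES wst_kinds(id))""")

conn.execute("INSERT INTO wsts (id, description, kind) VALUES (1, 'demo', 7)")
try:
    conn.execute("INSERT INTO wsts (id, description, kind) VALUES (2, 'bad', 99)")
    rejected = False
except sqlite3.IntegrityError:
    rejected = True  # an unregistered kind violates the foreign key
```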
@@ -182,7 +200,7 @@
CREATE TABLE wst_columns (
id int PRIMARY KEY NOT NULL,
- wst_id int NOT NULL REFERENCES wsts(id),
+ wst_id int NOT NULL REFERENCES wsts(id) ON DELETE CASCADE,
name VARCHAR(256) NOT NULL,
description VARCHAR(256),
position int NOT NULL DEFAULT 0,
@@ -198,7 +216,7 @@
CREATE TABLE wst_column_values (
id int PRIMARY KEY NOT NULL,
- wst_column_id int NOT NULL REFERENCES wst_columns(id),
+ wst_column_id int NOT NULL REFERENCES wst_columns(id) ON DELETE CASCADE,
position NUMERIC NOT NULL,
w NUMERIC NOT NULL,
@@ -211,7 +229,7 @@
CREATE TABLE wst_q_ranges (
id int PRIMARY KEY NOT NULL,
- range_id int NOT NULL REFERENCES ranges(id),
+ range_id int NOT NULL REFERENCES ranges(id) ON DELETE CASCADE,
q NUMERIC NOT NULL
);
@@ -220,8 +238,8 @@
CREATE TABLE wst_column_q_ranges (
id int PRIMARY KEY NOT NULL,
- wst_column_id int NOT NULL REFERENCES wst_columns(id),
- wst_q_range_id int NOT NULL REFERENCES wst_q_ranges(id),
+ wst_column_id int NOT NULL REFERENCES wst_columns(id) ON DELETE CASCADE,
+ wst_q_range_id int NOT NULL REFERENCES wst_q_ranges(id) ON DELETE CASCADE,
UNIQUE (wst_column_id, wst_q_range_id)
);
@@ -277,7 +295,7 @@
CREATE TABLE cross_sections (
id int PRIMARY KEY NOT NULL,
- river_id int NOT NULL REFERENCES rivers(id),
+ river_id int NOT NULL REFERENCES rivers(id) ON DELETE CASCADE,
time_interval_id int REFERENCES time_intervals(id),
description VARCHAR(256)
);
@@ -287,7 +305,7 @@
CREATE TABLE cross_section_lines (
id int PRIMARY KEY NOT NULL,
km NUMERIC NOT NULL,
- cross_section_id int NOT NULL REFERENCES cross_sections(id),
+ cross_section_id int NOT NULL REFERENCES cross_sections(id) ON DELETE CASCADE,
UNIQUE (km, cross_section_id)
);
@@ -295,7 +313,7 @@
CREATE TABLE cross_section_points (
id int PRIMARY KEY NOT NULL,
- cross_section_line_id int NOT NULL REFERENCES cross_section_lines(id),
+ cross_section_line_id int NOT NULL REFERENCES cross_section_lines(id) ON DELETE CASCADE,
col_pos int NOT NULL,
x NUMERIC NOT NULL,
y NUMERIC NOT NULL,
@@ -314,7 +332,7 @@
CREATE TABLE hyks (
id int PRIMARY KEY NOT NULL,
- river_id int NOT NULL REFERENCES rivers(id),
+ river_id int NOT NULL REFERENCES rivers(id) ON DELETE CASCADE,
description VARCHAR(256) NOT NULL
);
@@ -322,7 +340,7 @@
CREATE TABLE hyk_entries (
id int PRIMARY KEY NOT NULL,
- hyk_id int NOT NULL REFERENCES hyks(id),
+ hyk_id int NOT NULL REFERENCES hyks(id) ON DELETE CASCADE,
km NUMERIC NOT NULL,
measure TIMESTAMP,
UNIQUE (hyk_id, km)
@@ -333,7 +351,7 @@
CREATE TABLE hyk_formations (
id int PRIMARY KEY NOT NULL,
formation_num int NOT NULL DEFAULT 0,
- hyk_entry_id int NOT NULL REFERENCES hyk_entries(id),
+ hyk_entry_id int NOT NULL REFERENCES hyk_entries(id) ON DELETE CASCADE,
top NUMERIC NOT NULL,
bottom NUMERIC NOT NULL,
distance_vl NUMERIC NOT NULL,
@@ -354,7 +372,7 @@
CREATE TABLE hyk_flow_zones (
id int PRIMARY KEY NOT NULL,
- formation_id int NOT NULL REFERENCES hyk_formations(id),
+ formation_id int NOT NULL REFERENCES hyk_formations(id) ON DELETE CASCADE,
type_id int NOT NULL REFERENCES hyk_flow_zone_types(id),
a NUMERIC NOT NULL,
b NUMERIC NOT NULL,
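Nearly every hunk in this postgresql.sql change adds `ON DELETE CASCADE` to a river-referencing foreign key, so that deleting a river cleans out its dependent rows instead of failing on constraint violations. A minimal sketch of that semantics (illustrative only: SQLite instead of PostgreSQL, abridged columns and sample data):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only with this pragma
conn.execute("CREATE TABLE rivers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""CREATE TABLE ranges (
    id INTEGER PRIMARY KEY,
    river_id INTEGER NOT NULL REFERENCES rivers(id) ON DELETE CASCADE,
    a NUMERIC NOT NULL,
    b NUMERIC)""")
conn.execute("INSERT INTO rivers VALUES (1, 'Mosel')")
conn.execute("INSERT INTO ranges VALUES (1, 1, 0.0, 242.2)")

# Deleting the river cascades to its ranges; no orphan rows remain.
conn.execute("DELETE FROM rivers WHERE id = 1")
remaining = conn.execute("SELECT COUNT(*) FROM ranges").fetchone()[0]
```

Without the cascade, the `DELETE` on `rivers` would be rejected while dependent `ranges` rows exist.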
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/pom-oracle.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-backend/pom-oracle.xml Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,142 @@
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+ <modelVersion>4.0.0</modelVersion>
+
+ <groupId>de.intevation.flys</groupId>
+ <artifactId>flys-backend</artifactId>
+ <version>1.0-SNAPSHOT</version>
+ <packaging>jar</packaging>
+
+ <name>flys-backend</name>
+ <url>http://maven.apache.org</url>
+
+ <properties>
+ <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
+ </properties>
+
+ <build>
+ <plugins>
+ <plugin>
+ <groupId>org.codehaus.mojo</groupId>
+ <artifactId>hibernate3-maven-plugin</artifactId>
+ <version>2.2</version>
+ <!--
+ <configuration>
+ <componentProperties>
+ <propertyfile>src/main/config/hbm.properties</propertyfile>
+ </componentProperties>
+ </configuration>
+ -->
+ </plugin>
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-compiler-plugin</artifactId>
+ <version>2.0.2</version>
+ <configuration>
+ <source>1.6</source>
+ <target>1.6</target>
+ </configuration>
+ </plugin>
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-jar-plugin</artifactId>
+ <configuration>
+ <archive>
+ <manifest>
+ <mainClass>de.intevation.flys.importer.Importer</mainClass>
+ <packageName>de.intevation.flys.importer</packageName>
+ </manifest>
+ </archive>
+ </configuration>
+ </plugin>
+ <plugin>
+ <artifactId>maven-assembly-plugin</artifactId>
+ <configuration>
+ <archive>
+ <manifest>
+ <mainClass>de.intevation.flys.importer.Importer</mainClass>
+ </manifest>
+ </archive>
+ <descriptorRefs>
+ <descriptorRef>jar-with-dependencies</descriptorRef>
+ </descriptorRefs>
+ </configuration>
+ </plugin>
+ </plugins>
+ </build>
+
+ <dependencies>
+ <dependency>
+ <groupId>de.intevation.artifacts.common</groupId>
+ <artifactId>artifacts-common</artifactId>
+ <version>1.0-SNAPSHOT</version>
+ </dependency>
+ <dependency>
+ <groupId>junit</groupId>
+ <artifactId>junit</artifactId>
+ <version>3.8.1</version>
+ <scope>test</scope>
+ </dependency>
+ <dependency>
+ <groupId>net.sf.opencsv</groupId>
+ <artifactId>opencsv</artifactId>
+ <version>2.0</version>
+ </dependency>
+ <dependency>
+ <groupId>org.hibernate</groupId>
+ <artifactId>hibernate-core</artifactId>
+ <version>3.6.5.Final</version>
+ </dependency>
+ <dependency>
+ <groupId>org.hibernate</groupId>
+ <artifactId>hibernate-entitymanager</artifactId>
+ <version>3.6.5.Final</version>
+ </dependency>
+ <dependency>
+ <groupId>log4j</groupId>
+ <artifactId>log4j</artifactId>
+ <version>1.2.14</version>
+ </dependency>
+ <dependency>
+ <groupId>commons-dbcp</groupId>
+ <artifactId>commons-dbcp</artifactId>
+ <version>1.4</version>
+ </dependency>
+ <dependency>
+ <groupId>org.hibernatespatial</groupId>
+ <artifactId>hibernate-spatial-postgis</artifactId>
+ <version>1.1</version>
+ </dependency>
+ <dependency>
+ <groupId>org.hibernatespatial</groupId>
+ <artifactId>hibernate-spatial-oracle</artifactId>
+ <version>1.1</version>
+ </dependency>
+ <dependency>
+ <groupId>org.postgis</groupId>
+ <artifactId>postgis-jdbc</artifactId>
+ <version>1.3.3</version>
+ </dependency>
+ <dependency>
+ <groupId>ojdbc5.jar</groupId>
+ <artifactId>ojdbc5</artifactId>
+ <version>0</version>
+ </dependency>
+ </dependencies>
+
+ <repositories>
+ <repository>
+ <id>repository.jboss.org/nexus</id>
+ <name>JBoss Repository - Nexus</name>
+ <url>http://repository.jboss.org/nexus/content/groups/public/</url>
+ </repository>
+ <repository>
+ <id>OSGEO GeoTools repo</id>
+ <url>http://download.osgeo.org/webdav/geotools</url>
+ </repository>
+ <repository>
+ <id>Hibernate Spatial repo</id>
+ <url>http://www.hibernatespatial.org/repository</url>
+ </repository>
+ </repositories>
+</project>
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/pom.xml
--- a/flys-backend/pom.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/pom.xml Fri Mar 22 11:25:54 2013 +0100
@@ -37,6 +37,31 @@
<target>1.6</target>
</configuration>
</plugin>
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-jar-plugin</artifactId>
+ <configuration>
+ <archive>
+ <manifest>
+ <mainClass>de.intevation.flys.importer.Importer</mainClass>
+ <packageName>de.intevation.flys.importer</packageName>
+ </manifest>
+ </archive>
+ </configuration>
+ </plugin>
+ <plugin>
+ <artifactId>maven-assembly-plugin</artifactId>
+ <configuration>
+ <archive>
+ <manifest>
+ <mainClass>de.intevation.flys.importer.Importer</mainClass>
+ </manifest>
+ </archive>
+ <descriptorRefs>
+ <descriptorRef>jar-with-dependencies</descriptorRef>
+ </descriptorRefs>
+ </configuration>
+ </plugin>
</plugins>
</build>
@@ -89,11 +114,6 @@
<version>1.1</version>
</dependency>
<dependency>
- <groupId>org.hibernatespatial</groupId>
- <artifactId>hibernate-spatial-oracle</artifactId>
- <version>1.1</version>
- </dependency>
- <dependency>
<groupId>org.postgis</groupId>
<artifactId>postgis-jdbc</artifactId>
<version>1.3.3</version>
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/backend/Credentials.java
--- a/flys-backend/src/main/java/de/intevation/flys/backend/Credentials.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/backend/Credentials.java Fri Mar 22 11:25:54 2013 +0100
@@ -7,6 +7,7 @@
protected String dialect;
protected String driver;
protected String url;
+ protected String connectionInitSqls;
protected Class [] classes;
public Credentials() {
@@ -18,14 +19,16 @@
String dialect,
String driver,
String url,
+ String connectionInitSqls,
Class [] classes
) {
- this.user = user;
- this.password = password;
- this.dialect = dialect;
- this.driver = driver;
- this.url = url;
- this.classes = classes;
+ this.user = user;
+ this.password = password;
+ this.dialect = dialect;
+ this.driver = driver;
+ this.url = url;
+ this.connectionInitSqls = connectionInitSqls;
+ this.classes = classes;
}
public String getUser() {
@@ -68,6 +71,14 @@
this.url = url;
}
+ public String getConnectionInitSqls() {
+ return connectionInitSqls;
+ }
+
+ public void setConnectionInitSqls(String connectionInitSqls) {
+ this.connectionInitSqls = connectionInitSqls;
+ }
+
public Class [] getClasses() {
return classes;
}
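`connectionInitSqls` is a commons-dbcp pool property: a list of SQL statements executed once on each newly created physical connection (typically used for per-session setup). A rough sketch of the behaviour the new accessor feeds into the pool configuration (illustrative only: SQLite, and a semicolon-separated string instead of DBCP's statement collection):

```python
import sqlite3

def open_connection(db_path, connection_init_sqls=None):
    """Open a connection and run optional init statements first, mirroring
    commons-dbcp's 'connectionInitSqls' which this patch threads through
    Credentials. Sketch only; the string format here is an assumption."""
    conn = sqlite3.connect(db_path)
    for stmt in (s.strip() for s in (connection_init_sqls or "").split(";")):
        if stmt:
            conn.execute(stmt)  # each statement runs once per new connection
    return conn

conn = open_connection(":memory:", "PRAGMA foreign_keys = ON")
fk_enabled = conn.execute("PRAGMA foreign_keys").fetchone()[0]
```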
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/backend/FLYSCredentials.java
--- a/flys-backend/src/main/java/de/intevation/flys/backend/FLYSCredentials.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/backend/FLYSCredentials.java Fri Mar 22 11:25:54 2013 +0100
@@ -5,17 +5,19 @@
import de.intevation.flys.model.Annotation;
import de.intevation.flys.model.AnnotationType;
import de.intevation.flys.model.Attribute;
+import de.intevation.flys.model.AxisKind;
import de.intevation.flys.model.BedHeightEpoch;
import de.intevation.flys.model.BedHeightEpochValue;
import de.intevation.flys.model.BedHeightSingle;
import de.intevation.flys.model.BedHeightSingleValue;
import de.intevation.flys.model.BedHeightType;
import de.intevation.flys.model.Building;
-import de.intevation.flys.model.Catchment;
+import de.intevation.flys.model.BoundaryKind;
import de.intevation.flys.model.CrossSection;
import de.intevation.flys.model.CrossSectionLine;
import de.intevation.flys.model.CrossSectionPoint;
import de.intevation.flys.model.CrossSectionTrack;
+import de.intevation.flys.model.CrossSectionTrackKind;
import de.intevation.flys.model.DGM;
import de.intevation.flys.model.Depth;
import de.intevation.flys.model.DischargeTable;
@@ -23,9 +25,11 @@
import de.intevation.flys.model.DischargeZone;
import de.intevation.flys.model.Edge;
import de.intevation.flys.model.ElevationModel;
+import de.intevation.flys.model.FedState;
import de.intevation.flys.model.Fixpoint;
import de.intevation.flys.model.Floodmaps;
import de.intevation.flys.model.Floodplain;
+import de.intevation.flys.model.FloodplainKind;
import de.intevation.flys.model.FlowVelocityMeasurement;
import de.intevation.flys.model.FlowVelocityMeasurementValue;
import de.intevation.flys.model.FlowVelocityModel;
@@ -33,15 +37,16 @@
import de.intevation.flys.model.Gauge;
import de.intevation.flys.model.GaugeLocation;
import de.intevation.flys.model.GrainFraction;
+import de.intevation.flys.model.HWSKind;
+import de.intevation.flys.model.HWSLine;
+import de.intevation.flys.model.HWSPoint;
import de.intevation.flys.model.HYK;
import de.intevation.flys.model.HYKEntry;
import de.intevation.flys.model.HYKFlowZone;
import de.intevation.flys.model.HYKFlowZoneType;
import de.intevation.flys.model.HYKFormation;
-import de.intevation.flys.model.Hws;
import de.intevation.flys.model.HydrBoundary;
import de.intevation.flys.model.HydrBoundaryPoly;
-import de.intevation.flys.model.Line;
import de.intevation.flys.model.LocationSystem;
import de.intevation.flys.model.MainValue;
import de.intevation.flys.model.MainValueType;
@@ -56,18 +61,14 @@
import de.intevation.flys.model.RiverAxisKm;
import de.intevation.flys.model.SQRelation;
import de.intevation.flys.model.SQRelationValue;
+import de.intevation.flys.model.SectieKind;
+import de.intevation.flys.model.SobekKind;
import de.intevation.flys.model.SedimentDensity;
import de.intevation.flys.model.SedimentDensityValue;
import de.intevation.flys.model.SedimentYield;
import de.intevation.flys.model.SedimentYieldValue;
import de.intevation.flys.model.TimeInterval;
import de.intevation.flys.model.Unit;
-import de.intevation.flys.model.Waterlevel;
-import de.intevation.flys.model.WaterlevelDifference;
-import de.intevation.flys.model.WaterlevelDifferenceColumn;
-import de.intevation.flys.model.WaterlevelDifferenceValue;
-import de.intevation.flys.model.WaterlevelQRange;
-import de.intevation.flys.model.WaterlevelValue;
import de.intevation.flys.model.Wst;
import de.intevation.flys.model.WstColumn;
import de.intevation.flys.model.WstColumnQRange;
@@ -92,6 +93,9 @@
public static final String XPATH_URL =
"/artifact-database/backend-database/url/text()";
+ public static final String XPATH_CONNECTION_INIT_SQLS =
+ "/artifact-database/backend-database/connection-init-sqls/text()";
+
public static final String DEFAULT_USER =
System.getProperty("flys.backend.user", "flys");
@@ -113,21 +117,27 @@
"flys.backend.url",
"jdbc:postgresql://localhost:5432/flys");
+ public static final String DEFAULT_CONNECTION_INIT_SQLS =
+ System.getProperty(
+ "flys.backend.connection.init.sqls");
+
public static final Class [] CLASSES = {
Annotation.class,
AnnotationType.class,
Attribute.class,
+ AxisKind.class,
BedHeightEpoch.class,
BedHeightEpochValue.class,
BedHeightSingle.class,
BedHeightSingleValue.class,
BedHeightType.class,
Building.class,
- Catchment.class,
+ BoundaryKind.class,
CrossSection.class,
CrossSectionLine.class,
CrossSectionPoint.class,
CrossSectionTrack.class,
+ CrossSectionTrackKind.class,
Depth.class,
DGM.class,
DischargeTable.class,
@@ -135,8 +145,10 @@
DischargeZone.class,
Edge.class,
ElevationModel.class,
+ FedState.class,
Fixpoint.class,
Floodplain.class,
+ FloodplainKind.class,
Floodmaps.class,
FlowVelocityMeasurement.class,
FlowVelocityMeasurementValue.class,
@@ -145,7 +157,9 @@
Gauge.class,
GaugeLocation.class,
GrainFraction.class,
- Hws.class,
+ HWSKind.class,
+ HWSLine.class,
+ HWSPoint.class,
HydrBoundary.class,
HydrBoundaryPoly.class,
HYK.class,
@@ -153,7 +167,6 @@
HYKFormation.class,
HYKFlowZoneType.class,
HYKFlowZone.class,
- Line.class,
LocationSystem.class,
MainValueType.class,
MeasurementStation.class,
@@ -166,6 +179,8 @@
River.class,
RiverAxis.class,
RiverAxisKm.class,
+ SectieKind.class,
+ SobekKind.class,
SedimentDensity.class,
SedimentDensityValue.class,
SedimentYield.class,
@@ -174,12 +189,6 @@
SQRelationValue.class,
TimeInterval.class,
Unit.class,
- Waterlevel.class,
- WaterlevelDifference.class,
- WaterlevelDifferenceColumn.class,
- WaterlevelDifferenceValue.class,
- WaterlevelQRange.class,
- WaterlevelValue.class,
WstColumn.class,
WstColumnQRange.class,
WstColumnValue.class,
@@ -195,9 +204,11 @@
String password,
String dialect,
String driver,
- String url
+ String url,
+ String connectionInitSqls
) {
- super(user, password, dialect, driver, url, CLASSES);
+ super(
+ user, password, dialect, driver, url, connectionInitSqls, CLASSES);
}
private static Credentials instance;
@@ -214,9 +225,13 @@
Config.getStringXPath(XPATH_DRIVER, DEFAULT_DRIVER);
String url =
Config.getStringXPath(XPATH_URL, DEFAULT_URL);
+ String connectionInitSqls =
+ Config.getStringXPath(
+ XPATH_CONNECTION_INIT_SQLS,
+ DEFAULT_CONNECTION_INIT_SQLS);
instance = new FLYSCredentials(
- user, password, dialect, driver, url);
+ user, password, dialect, driver, url, connectionInitSqls);
}
return instance;
}
@@ -227,7 +242,8 @@
DEFAULT_PASSWORD,
DEFAULT_DIALECT,
DEFAULT_DRIVER,
- DEFAULT_URL);
+ DEFAULT_URL,
+ DEFAULT_CONNECTION_INIT_SQLS);
}
}
// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/backend/SedDBCredentials.java
--- a/flys-backend/src/main/java/de/intevation/flys/backend/SedDBCredentials.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/backend/SedDBCredentials.java Fri Mar 22 11:25:54 2013 +0100
@@ -73,6 +73,9 @@
public static final String XPATH_URL =
"/artifact-database/seddb-database/url/text()";
+ public static final String XPATH_CONNECTION_INIT_SQLS =
+ "/artifact-database/seddb-database/connection-init-sqls/text()";
+
public static final String DEFAULT_USER =
System.getProperty("flys.seddb.user", "seddb");
@@ -94,6 +97,10 @@
"flys.seddb.url",
"jdbc:postgresql://localhost:5432/seddb");
+ public static final String DEFAULT_CONNECTION_INIT_SQLS =
+ System.getProperty(
+ "flys.seddb.connection.init.sqls");
+
public static final Class [] CLASSES = {
BezugspegelgewId.class,
Bezugspegelgew.class,
@@ -159,9 +166,11 @@
String password,
String dialect,
String driver,
- String url
+ String url,
+ String connectionInitSqls
) {
- super(user, password, dialect, driver, url, CLASSES);
+ super(
+ user, password, dialect, driver, url, connectionInitSqls, CLASSES);
}
public static synchronized Credentials getInstance() {
@@ -176,9 +185,13 @@
Config.getStringXPath(XPATH_DRIVER, DEFAULT_DRIVER);
String url =
Config.getStringXPath(XPATH_URL, DEFAULT_URL);
+ String connectionInitSqls =
+ Config.getStringXPath(
+ XPATH_CONNECTION_INIT_SQLS,
+ DEFAULT_CONNECTION_INIT_SQLS);
instance = new SedDBCredentials(
- user, password, dialect, driver, url);
+ user, password, dialect, driver, url, connectionInitSqls);
}
return instance;
}
@@ -189,7 +202,8 @@
DEFAULT_PASSWORD,
DEFAULT_DIALECT,
DEFAULT_DRIVER,
- DEFAULT_URL);
+ DEFAULT_URL,
+ DEFAULT_CONNECTION_INIT_SQLS);
}
}
// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/backend/SessionFactoryProvider.java
--- a/flys-backend/src/main/java/de/intevation/flys/backend/SessionFactoryProvider.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/backend/SessionFactoryProvider.java Fri Mar 22 11:25:54 2013 +0100
@@ -137,6 +137,11 @@
props.setProperty(Environment.DRIVER, credentials.getDriver());
props.setProperty(Environment.URL, credentials.getUrl());
+ String connectionInitSqls = credentials.getConnectionInitSqls();
+ if (connectionInitSqls != null) {
+ props.setProperty("connectionInitSqls", connectionInitSqls);
+ }
+
cfg.mergeProperties(props);
return cfg;
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/backend/SpatialInfo.java
--- a/flys-backend/src/main/java/de/intevation/flys/backend/SpatialInfo.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/backend/SpatialInfo.java Fri Mar 22 11:25:54 2013 +0100
@@ -4,15 +4,16 @@
import org.apache.log4j.Logger;
+import org.hibernate.HibernateException;
import org.hibernate.Query;
import org.hibernate.Session;
import de.intevation.flys.model.Building;
import de.intevation.flys.model.CrossSectionTrack;
import de.intevation.flys.model.Fixpoint;
-import de.intevation.flys.model.Line;
import de.intevation.flys.model.River;
import de.intevation.flys.model.RiverAxis;
+import de.intevation.flys.model.HWSLine;
public class SpatialInfo {
@@ -42,7 +43,6 @@
logger.info("Spatial information of River '" + RIVERNAME + "'");
spatial.doRiverAxisInfo(river);
spatial.doCrossSectionTracksInfo(river);
- spatial.doLinesInfo(river);
spatial.doBuildingsInfo(river);
spatial.doFixpointsInfo(river);
}
@@ -85,13 +85,20 @@
protected void doRiverAxisInfo(River river) {
- List<RiverAxis> axis = RiverAxis.getRiverAxis(river.getName());
- if (axis != null && axis.size() > 0) {
- logger.debug("TODO: Compute length and boundary.");
+ try {
+ List<RiverAxis> axis = RiverAxis.getRiverAxis(river.getName());
+ if (axis != null && axis.size() > 0) {
+ logger.debug("TODO: Compute length and boundary.");
+ }
+ else {
+ logger.warn("River has no RiverAxis.");
+ }
}
- else {
- logger.warn("River has no RiverAxis.");
+ catch (HibernateException he) {
+ logger.warn("No valid river axis found for " + river.getName(), he);
+ return;
}
+
}
@@ -112,23 +119,6 @@
}
- protected void doLinesInfo(River river) {
- Query query = session.createQuery(
- "from Line where river =:river");
- query.setParameter("river", river);
-
- List<Line> list = query.list();
-
- if (list == null || list.size() == 0) {
- logger.warn("No Lines for '" + river.getName() + "' found!");
- return;
- }
- else {
- logger.info("River contains " + list.size() + " Lines.");
- }
- }
-
-
protected void doBuildingsInfo(River river) {
Query query = session.createQuery(
"from Building where river =:river");
@@ -161,5 +151,26 @@
logger.info("River contains " + list.size() + " Fixpoints.");
}
}
+
+ @Deprecated
+ protected void doLinesInfo(River river) {
+ doHWSLinesInfo(river);
+ }
+
+ protected void doHWSLinesInfo(River river) {
+ Query query = session.createQuery(
+ "from HWSLine where river =:river");
+ query.setParameter("river", river);
+
+ List<HWSLine> list = query.list();
+
+ if (list == null || list.size() == 0) {
+ logger.warn("No Lines for '" + river.getName() + "' found!");
+ return;
+ }
+ else {
+ logger.info("River contains " + list.size() + " Lines.");
+ }
+ }
}
// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf-8 :
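The rewritten `doRiverAxisInfo` wraps the axis lookup in a try/catch so that a broken or missing geometry logs a warning instead of aborting the whole spatial report. The control flow can be sketched like this (illustrative names; `fetch_axis` and `RuntimeError` stand in for `RiverAxis.getRiverAxis` and `HibernateException`):

```python
import logging

logger = logging.getLogger("SpatialInfo")

def do_river_axis_info(fetch_axis, river_name):
    """Downgrade a failing axis lookup to a warning instead of a crash.
    fetch_axis is an illustrative stand-in for RiverAxis.getRiverAxis."""
    try:
        axis = fetch_axis(river_name)
    except RuntimeError:  # stands in for HibernateException
        logger.warning("No valid river axis found for %s", river_name)
        return None
    if axis:
        return axis
    logger.warning("River has no RiverAxis.")
    return None

def broken_lookup(name):
    raise RuntimeError("invalid geometry")

ok = do_river_axis_info(lambda name: ["axis-segment"], "Mosel")
failed = do_river_axis_info(broken_lookup, "Mosel")
```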
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/ImportBedHeightSingle.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/ImportBedHeightSingle.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/importer/ImportBedHeightSingle.java Fri Mar 22 11:25:54 2013 +0100
@@ -22,7 +22,7 @@
{
private static Logger log = Logger.getLogger(ImportBedHeightSingle.class);
- protected int year;
+ protected Integer year;
protected int soundingWidth;
protected String evaluationBy;
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/ImportBedHeightType.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/ImportBedHeightType.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/importer/ImportBedHeightType.java Fri Mar 22 11:25:54 2013 +0100
@@ -16,19 +16,22 @@
Logger.getLogger(ImportBedHeightType.class);
protected String name;
- protected String description;
protected BedHeightType peer;
-
- public ImportBedHeightType(String name, String description) {
- this.name = name;
- this.description = description;
+ public ImportBedHeightType(BedHeightType peer) {
+ this.peer = peer;
+ name = peer.getName();
}
+ public ImportBedHeightType(String name) {
+ this.name = name;
+ }
+
+
public void storeDependencies() {
- BedHeightType type = getPeer();
+ getPeer();
}
@@ -37,16 +40,14 @@
Session session = ImporterSession.getInstance().getDatabaseSession();
Query query = session.createQuery(
- "from BedHeightType where " +
- "name=:name and description=:description");
+ "from BedHeightType where name=:name");
query.setParameter("name", name);
- query.setParameter("description", description);
List<BedHeightType> types = query.list();
if (types.isEmpty()) {
- peer = new BedHeightType(name, description);
+ peer = new BedHeightType(name);
session.save(peer);
}
else {
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/ImportFlowVelocityModel.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/ImportFlowVelocityModel.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/importer/ImportFlowVelocityModel.java Fri Mar 22 11:25:54 2013 +0100
@@ -85,15 +85,16 @@
DischargeZone zone = dischargeZone.getPeer(river);
Query query = session.createQuery("from FlowVelocityModel where "
- + " river=:river and " + " dischargeZone=:dischargeZone");
+ + " dischargeZone=:dischargeZone");
- query.setParameter("river", river);
+ //query.setParameter("river", river);
query.setParameter("dischargeZone", zone);
List<FlowVelocityModel> model = query.list();
if (model.isEmpty()) {
- peer = new FlowVelocityModel(river, zone);
+ //peer = new FlowVelocityModel(river, zone);
+ peer = new FlowVelocityModel(zone);
session.save(peer);
}
else {
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/ImportRiver.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/ImportRiver.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/importer/ImportRiver.java Fri Mar 22 11:25:54 2013 +0100
@@ -130,6 +130,12 @@
protected List<ImportWst> floodProtection;
+ /** Wst-structures from waterlevel-csv files. */
+ protected List<ImportWst> waterlevels;
+
+ /** Wst-structures from waterlevel-difference-csv files. */
+ protected List<ImportWst> waterlevelDifferences;
+
protected List<ImportBedHeight> bedHeightSingles;
protected List<ImportBedHeight> bedHeightEpochs;
@@ -144,10 +150,6 @@
protected List<ImportSedimentYield> sedimentYields;
- protected List<ImportWaterlevel> waterlevels;
-
- protected List<ImportWaterlevelDifference> waterlevelDiffs;
-
protected List<ImportMeasurementStation> measurementStations;
protected List<ImportSQRelation> sqRelations;
@@ -194,7 +196,7 @@
addCrossSections(parser);
}
- } // ImportRiverCrossSectionParserCallback
+ } // ImportRiverCrossSectionParserCallback
public ImportRiver() {
@@ -204,14 +206,14 @@
fixations = new ArrayList<ImportWst>();
officialLines = new ArrayList<ImportWst>();
floodWater = new ArrayList<ImportWst>();
+ waterlevels = new ArrayList<ImportWst>();
+ waterlevelDifferences = new ArrayList<ImportWst>();
floodProtection = new ArrayList<ImportWst>();
sedimentDensities = new ArrayList<ImportSedimentDensity>();
morphologicalWidths = new ArrayList<ImportMorphWidth>();
flowVelocityModels = new ArrayList<ImportFlowVelocityModel>();
flowVelocityMeasurements = new ArrayList<ImportFlowVelocityMeasurement>();
sedimentYields = new ArrayList<ImportSedimentYield>();
- waterlevels = new ArrayList<ImportWaterlevel>();
- waterlevelDiffs = new ArrayList<ImportWaterlevelDifference>();
measurementStations = new ArrayList<ImportMeasurementStation>();
sqRelations = new ArrayList<ImportSQRelation>();
}
@@ -550,7 +552,7 @@
File[] files = wspDir.listFiles();
if (files == null) {
- log.warn("Cannot read directory '" + wspDir + "'");
+ log.warn("Cannot read waterlevel directory '" + wspDir + "'");
return;
}
@@ -562,9 +564,10 @@
// The parsed ImportWaterlevels are converted to
// 'fixation'-wsts now.
- for(ImportWst iw: parser.exportWsts()) {
- //iw.setDescription("CSV" + iw.getDescription());
- fixations.add(iw);
+ for(ImportWst iw: parser.getWaterlevels()) {
+ iw.setDescription("CSV/" + iw.getDescription());
+ iw.setKind(7); // 7 = 'morpho_waterlevels' in wst_kinds
+ waterlevels.add(iw);
}
}
@@ -624,7 +627,12 @@
parser.parse(file);
}
- waterlevelDiffs = parser.getDifferences();
+ // WaterlevelDifferences become Wsts now.
+ for(ImportWst iw: parser.getDifferences()) {
+ iw.setDescription("CSV/" + iw.getDescription());
+ iw.setKind(6); // 6 = 'morpho_waterlevel-differences' in wst_kinds
+ waterlevelDifferences.add(iw);
+ }
}
@@ -1122,16 +1130,43 @@
}
public void storeFixations() {
- if (!Config.INSTANCE.skipFixations() || !Config.INSTANCE.skipWaterlevels()) {
- log.info("store fixation wsts and/or csvs");
+ if (!Config.INSTANCE.skipFixations()) {
+ log.info("store fixation wsts");
River river = getPeer();
- for (ImportWst wst: fixations) {
- log.debug("name: " + wst.getDescription());
- wst.storeDependencies(river);
+ for (ImportWst fWst: fixations) {
+ log.debug("Fixation name: " + fWst.getDescription());
+ fWst.storeDependencies(river);
}
}
}
+
+ /** Store wsts from waterlevel-csv files. */
+ public void storeWaterlevels() {
+ if (!Config.INSTANCE.skipWaterlevels()) {
+ log.info("store waterlevel wsts from csv");
+ River river = getPeer();
+ for (ImportWst wWst: waterlevels) {
+ log.debug("Waterlevel name: " + wWst.getDescription());
+ wWst.storeDependencies(river);
+ }
+ }
+ }
+
+
+ /** Store wsts from waterlevel-difference-csv files. */
+ public void storeWaterlevelDifferences() {
+ if (!Config.INSTANCE.skipWaterlevelDifferences()) {
+ log.info("store waterlevel difference wsts from csv");
+ River river = getPeer();
+ for (ImportWst dWst: waterlevelDifferences) {
+ log.debug("Waterlevel difference name: " + dWst.getDescription());
+ dWst.storeDependencies(river);
+ }
+ }
+ }
+
+
public void storeExtraWsts() {
if (!Config.INSTANCE.skipExtraWsts()) {
log.info("store extra wsts");
@@ -1165,6 +1200,7 @@
}
}
+
public void storeFloodProtection() {
if (!Config.INSTANCE.skipFloodProtection()) {
log.info("store flood protection wsts");
@@ -1343,40 +1379,6 @@
}
- public void storeWaterlevels() {
- if (!Config.INSTANCE.skipWaterlevels()) {
- log.info("store waterlevels");
-
- River river = getPeer();
-
- for (ImportWaterlevel waterlevel: waterlevels) {
- waterlevel.storeDependencies(river);
- }
- }
- }
-
-
- public void storeWaterlevelDifferences() {
- if (!Config.INSTANCE.skipWaterlevelDifferences()) {
- log.info("store waterlevel differences");
-
- River river = getPeer();
-
- for (ImportWaterlevelDifference diff: waterlevelDiffs) {
- try {
- diff.storeDependencies(river);
- }
- catch (SQLException sqle) {
- log.error("Error while storing waterlevel diff.", sqle);
- }
- catch (ConstraintViolationException cve) {
- log.error("Error while storing waterlevel diff.", cve);
- }
- }
- }
- }
-
-
public void storeMeasurementStations() {
if (!Config.INSTANCE.skipMeasurementStations()) {
log.info("store measurement stations");
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/ImportWaterlevel.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/ImportWaterlevel.java Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,102 +0,0 @@
-package de.intevation.flys.importer;
-
-import java.util.ArrayList;
-import java.util.List;
-
-import org.apache.log4j.Logger;
-
-import org.hibernate.Session;
-import org.hibernate.Query;
-
-import de.intevation.flys.model.River;
-import de.intevation.flys.model.Unit;
-import de.intevation.flys.model.Waterlevel;
-
-
-public class ImportWaterlevel {
-
- private static final Logger log = Logger.getLogger(ImportWaterlevel.class);
-
- private ImportUnit unit;
-
- private String description;
-
- private List<ImportWaterlevelQRange> qRanges;
-
- private Waterlevel peer;
-
- public ImportWaterlevel(String description) {
- this.qRanges = new ArrayList<ImportWaterlevelQRange>();
-
- this.description = description;
- }
-
- public String getDescription() {
- return this.description;
- }
-
- public void setUnit(ImportUnit unit) {
- this.unit = unit;
- }
-
- public ImportUnit getUnit() {
- return this.unit;
- }
-
- public void addValue(ImportWaterlevelQRange qRange) {
- this.qRanges.add(qRange);
- }
-
- public List<ImportWaterlevelQRange> getQRanges() {
- return this.qRanges;
- }
-
- public void storeDependencies(River river) {
- log.info("store dependencies");
-
- Waterlevel peer = getPeer(river);
-
- if (peer != null) {
- int i = 0;
-
- for (ImportWaterlevelQRange qRange : qRanges) {
- qRange.storeDependencies(peer);
- i++;
- }
-
- log.info("stored " + i + " waterlevel q ranges");
- }
- }
-
- public Waterlevel getPeer(River river) {
- Unit u = unit != null ? unit.getPeer() : null;
- if (u == null) {
- log.warn("skip invalid waterlevel - no unit set!");
- return null;
- }
-
- if (peer == null) {
- Session session = ImporterSession.getInstance()
- .getDatabaseSession();
- Query query = session.createQuery("from Waterlevel where "
- + " river=:river and " + " unit=:unit and "
- + " description=:description");
-
- query.setParameter("river", river);
- query.setParameter("unit", u);
- query.setParameter("description", description);
-
- List<Waterlevel> wsts = query.list();
- if (wsts.isEmpty()) {
- peer = new Waterlevel(river, u, description);
- session.save(peer);
- }
- else {
- peer = wsts.get(0);
- }
- }
-
- return peer;
- }
-}
-// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/ImportWaterlevelDifference.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/ImportWaterlevelDifference.java Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,95 +0,0 @@
-package de.intevation.flys.importer;
-
-import java.sql.SQLException;
-import java.util.ArrayList;
-import java.util.List;
-
-import org.apache.log4j.Logger;
-
-import org.hibernate.Session;
-import org.hibernate.Query;
-import org.hibernate.exception.ConstraintViolationException;
-
-import de.intevation.flys.model.River;
-import de.intevation.flys.model.Unit;
-import de.intevation.flys.model.WaterlevelDifference;
-
-
-public class ImportWaterlevelDifference {
-
- private static final Logger log = Logger
- .getLogger(ImportWaterlevelDifference.class);
-
- private ImportUnit unit;
-
- private String description;
-
- private List<ImportWaterlevelDifferenceColumn> columns;
-
- private WaterlevelDifference peer;
-
- public ImportWaterlevelDifference(String description) {
- this.columns = new ArrayList<ImportWaterlevelDifferenceColumn>();
-
- this.description = description;
- }
-
- public void setUnit(ImportUnit unit) {
- this.unit = unit;
- }
-
- public void addValue(ImportWaterlevelDifferenceColumn column) {
- this.columns.add(column);
- }
-
- public void storeDependencies(River river) throws SQLException,
- ConstraintViolationException {
- log.info("store dependencies");
-
- WaterlevelDifference peer = getPeer(river);
-
- if (peer != null) {
- int i = 0;
-
- for (ImportWaterlevelDifferenceColumn column : columns) {
- column.storeDependencies(peer);
- i++;
- }
-
- log.info("stored " + i + " waterlevel difference columns");
- }
- }
-
- public WaterlevelDifference getPeer(River river) {
- Unit u = unit != null ? unit.getPeer() : null;
- if (u == null) {
- log.warn("IWD: skip invalid waterlevel difference - no unit set!");
- return null;
- }
-
- if (peer == null) {
- Session session = ImporterSession.getInstance()
- .getDatabaseSession();
- Query query = session
- .createQuery("from WaterlevelDifference where "
- + " river=:river and " + " unit=:unit and "
- + " description=:description");
-
- query.setParameter("river", river);
- query.setParameter("unit", u);
- query.setParameter("description", description);
-
- List<WaterlevelDifference> diffs = query.list();
- if (diffs.isEmpty()) {
- peer = new WaterlevelDifference(river, u, description);
- session.save(peer);
- }
- else {
- peer = diffs.get(0);
- }
- }
-
- return peer;
- }
-}
-// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/ImportWaterlevelDifferenceColumn.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/ImportWaterlevelDifferenceColumn.java Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,81 +0,0 @@
-package de.intevation.flys.importer;
-
-import java.util.ArrayList;
-import java.util.List;
-
-import org.apache.log4j.Logger;
-
-import org.hibernate.Session;
-import org.hibernate.Query;
-
-import de.intevation.flys.model.WaterlevelDifference;
-import de.intevation.flys.model.WaterlevelDifferenceColumn;
-
-
-public class ImportWaterlevelDifferenceColumn {
-
- private static final Logger log =
- Logger.getLogger(ImportWaterlevelDifferenceColumn.class);
-
-
- private String description;
-
- private List<ImportWaterlevelDifferenceValue> values;
-
- private WaterlevelDifferenceColumn peer;
-
-
- public ImportWaterlevelDifferenceColumn(String description) {
- this.values = new ArrayList<ImportWaterlevelDifferenceValue>();
-
- this.description = description;
- }
-
-
- public void addValue(ImportWaterlevelDifferenceValue value) {
- this.values.add(value);
- }
-
-
- public void storeDependencies(WaterlevelDifference difference) {
- log.info("store dependencies");
-
- WaterlevelDifferenceColumn peer = getPeer(difference);
-
- int i = 0;
-
- for (ImportWaterlevelDifferenceValue value: values) {
- value.storeDependencies(peer);
- i++;
- }
-
- log.info("stored " + i + " waterlevel difference values");
- }
-
-
- public WaterlevelDifferenceColumn getPeer(WaterlevelDifference diff) {
- if (peer == null) {
- Session session = ImporterSession.getInstance().getDatabaseSession();
- Query query = session.createQuery(
- "from WaterlevelDifferenceColumn where " +
- " difference=:difference and " +
- " description=:description"
- );
-
- query.setParameter("difference", diff);
- query.setParameter("description", description);
-
- List<WaterlevelDifferenceColumn> cols = query.list();
- if (cols.isEmpty()) {
- peer = new WaterlevelDifferenceColumn(diff, description);
- session.save(peer);
- }
- else {
- peer = cols.get(0);
- }
- }
-
- return peer;
- }
-}
-// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/ImportWaterlevelDifferenceValue.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/ImportWaterlevelDifferenceValue.java Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,64 +0,0 @@
-package de.intevation.flys.importer;
-
-import java.util.List;
-
-import org.apache.log4j.Logger;
-
-import org.hibernate.Session;
-import org.hibernate.Query;
-
-import de.intevation.flys.model.WaterlevelDifferenceColumn;
-import de.intevation.flys.model.WaterlevelDifferenceValue;
-
-
-public class ImportWaterlevelDifferenceValue {
-
- private static final Logger log =
- Logger.getLogger(ImportWaterlevelDifferenceValue.class);
-
-
- private Double station;
- private Double value;
-
- private WaterlevelDifferenceValue peer;
-
-
- public ImportWaterlevelDifferenceValue(Double station, Double value) {
- this.station = station;
- this.value = value;
- }
-
-
- public void storeDependencies(WaterlevelDifferenceColumn column) {
- getPeer(column);
- }
-
-
- public WaterlevelDifferenceValue getPeer(WaterlevelDifferenceColumn column) {
- if (peer == null) {
- Session session = ImporterSession.getInstance().getDatabaseSession();
- Query query = session.createQuery(
- "from WaterlevelDifferenceValue where " +
- " column=:column and " +
- " station=:station and " +
- " value=:value"
- );
-
- query.setParameter("column", column);
- query.setParameter("station", station);
- query.setParameter("value", value);
-
- List<WaterlevelDifferenceValue> values = query.list();
- if (values.isEmpty()) {
- peer = new WaterlevelDifferenceValue(column, station, value);
- session.save(peer);
- }
- else {
- peer = values.get(0);
- }
- }
-
- return peer;
- }
-}
-// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/ImportWaterlevelQRange.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/ImportWaterlevelQRange.java Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,86 +0,0 @@
-package de.intevation.flys.importer;
-
-import java.util.ArrayList;
-import java.util.List;
-
-import org.apache.log4j.Logger;
-
-import org.hibernate.Session;
-import org.hibernate.Query;
-
-import de.intevation.flys.model.Waterlevel;
-import de.intevation.flys.model.WaterlevelQRange;
-
-
-/** Has a Q and list of W,km values. */
-public class ImportWaterlevelQRange {
-
- private static final Logger log =
- Logger.getLogger(ImportWaterlevelQRange.class);
-
- private Double q;
-
- private List<ImportWaterlevelValue> values;
-
- private WaterlevelQRange peer;
-
-
- public ImportWaterlevelQRange(Double q) {
- this.values = new ArrayList<ImportWaterlevelValue>();
- this.q = q;
- }
-
- public void addValue(ImportWaterlevelValue value) {
- this.values.add(value);
- }
-
- public Double getQ() {
- return this.q;
- }
-
- public List<ImportWaterlevelValue> getValues() {
- return values;
- }
-
- public void storeDependencies(Waterlevel waterlevel) {
- log.info("store dependencies");
-
- WaterlevelQRange peer = getPeer(waterlevel);
-
- int i = 0;
-
- for (ImportWaterlevelValue value: values) {
- value.storeDependencies(peer);
- i++;
- }
-
- log.info("stored " + i + " waterlevel values");
- }
-
-
- public WaterlevelQRange getPeer(Waterlevel waterlevel) {
- if (peer == null) {
- Session session = ImporterSession.getInstance().getDatabaseSession();
- Query query = session.createQuery(
- "from WaterlevelQRange where " +
- " waterlevel=:waterlevel and " +
- " q=:q"
- );
-
- query.setParameter("waterlevel", waterlevel);
- query.setParameter("q", q);
-
- List<WaterlevelQRange> qRanges = query.list();
- if (qRanges.isEmpty()) {
- peer = new WaterlevelQRange(waterlevel, q);
- session.save(peer);
- }
- else {
- peer = qRanges.get(0);
- }
- }
-
- return peer;
- }
-}
-// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/ImportWaterlevelValue.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/ImportWaterlevelValue.java Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,69 +0,0 @@
-package de.intevation.flys.importer;
-
-import java.util.List;
-
-import org.hibernate.Session;
-import org.hibernate.Query;
-
-import de.intevation.flys.model.WaterlevelQRange;
-import de.intevation.flys.model.WaterlevelValue;
-
-
-/** W and a station. */
-public class ImportWaterlevelValue {
-
- private Double station;
- private Double w;
-
- private WaterlevelValue peer;
-
-
- public ImportWaterlevelValue(Double station, Double w) {
- this.station = station;
- this.w = w;
- }
-
-
- public void storeDependencies(WaterlevelQRange qRange) {
- getPeer(qRange);
- }
-
-
- public Double getStation() {
- return this.station;
- }
-
-
- public Double getW() {
- return this.w;
- }
-
- public WaterlevelValue getPeer(WaterlevelQRange qRange) {
- if (peer == null) {
- Session session = ImporterSession.getInstance().getDatabaseSession();
- Query query = session.createQuery(
- "from WaterlevelValue where " +
- " qrange=:qrange and " +
- " station=:station and " +
- " w=:w"
- );
-
- query.setParameter("qrange", qRange);
- query.setParameter("station", station);
- query.setParameter("w", w);
-
- List<WaterlevelValue> values = query.list();
-
- if (values.isEmpty()) {
- peer = new WaterlevelValue(qRange, station, w);
- session.save(peer);
- }
- else {
- peer = values.get(0);
- }
- }
-
- return peer;
- }
-}
-// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
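The six importer classes deleted above all share one Hibernate idiom that the surviving ImportWst machinery keeps using: `getPeer` caches its result, queries for an existing row, and creates-and-saves one only when none is found. A dependency-free sketch of that get-or-create shape, with a `Map` standing in for the Hibernate session (all names here are illustrative, not from this repository):

```java
import java.util.HashMap;
import java.util.Map;

public class GetOrCreateDemo {
    // Stands in for the Hibernate session / database table.
    static final Map<String, String> SESSION = new HashMap<>();

    private final String key;
    private String peer; // cached after the first lookup, like the importer classes

    GetOrCreateDemo(String key) {
        this.key = key;
    }

    String getPeer() {
        if (peer == null) {              // cache miss: consult the "database"
            peer = SESSION.get(key);
            if (peer == null) {          // not persisted yet: create and save
                peer = "row:" + key;
                SESSION.put(key, peer);
            }
        }
        return peer;                     // same row on every subsequent call
    }

    public static void main(String[] args) {
        GetOrCreateDemo a = new GetOrCreateDemo("waterlevel-1");
        GetOrCreateDemo b = new GetOrCreateDemo("waterlevel-1");
        // Both importers resolve to the same persisted row.
        System.out.println(a.getPeer().equals(b.getPeer()));
    }
}
```

The cached `peer` field is what makes repeated `storeDependencies` calls cheap: only the first call touches the session.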
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/ImportWst.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/ImportWst.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/importer/ImportWst.java Fri Mar 22 11:25:54 2013 +0100
@@ -53,6 +53,7 @@
this.description = description;
}
+ /** Create columns that can be accessed with getColumn. */
public void setNumberColumns(int numColumns) {
for (int i = 0; i < numColumns; ++i) {
columns.add(new ImportWstColumn(this, null, null, i));
@@ -67,6 +68,15 @@
return columns.get(index);
}
+ public List<ImportWstColumn> getColumns() {
+ return columns;
+ }
+
+ /** Adds a column. Assumes that the column's wst is this instance. */
+ public void addColumn(ImportWstColumn column) {
+ columns.add(column);
+ }
+
public ImportUnit getUnit() {
return unit;
}
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/ImportWstColumn.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/ImportWstColumn.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/importer/ImportWstColumn.java Fri Mar 22 11:25:54 2013 +0100
@@ -93,6 +93,13 @@
new ImportWstColumnQRange(this, columnQRange));
}
+
+ /** Get the column values stored in this column. */
+ public List<ImportWstColumnValue> getColumnValues() {
+ return columnValues;
+ }
+
+
public void storeDependencies(River river) {
log.info("store column '" + name + "'");
WstColumn column = getPeer(river);
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/Importer.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/Importer.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/importer/Importer.java Fri Mar 22 11:25:54 2013 +0100
@@ -76,7 +76,7 @@
catch (HibernateException he) {
Throwable t = he.getCause();
while (t instanceof SQLException) {
- SQLException sqle = (SQLException)t;
+ SQLException sqle = (SQLException) t;
log.error("SQL exeception chain:", sqle);
t = sqle.getNextException();
}
@@ -141,7 +141,8 @@
infoGewParser.parse(gewFile);
}
catch (IOException ioe) {
- log.error("error while parsing gew: " + gew);
+ log.error("error while parsing gew: " + gew, ioe);
+ System.exit(1);
}
}
@@ -156,7 +157,8 @@
infoGewParser.parse(gewFile);
}
catch (IOException ioe) {
- log.error("error while parsing gew: " + gew);
+ log.error("error while parsing gew: " + gew, ioe);
+ System.exit(1);
}
}
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/parsers/AnnotationsParser.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/parsers/AnnotationsParser.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/importer/parsers/AnnotationsParser.java Fri Mar 22 11:25:54 2013 +0100
@@ -156,7 +156,7 @@
attribute, position, range, edge, type);
if (!annotations.add(annotation)) {
- log.warn("ANN: duplicated annotation '" + parts[0] +
+ log.info("ANN: duplicate annotation '" + parts[0] +
"' in line " + in.getLineNumber());
}
}
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/parsers/AtFileParser.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/parsers/AtFileParser.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/importer/parsers/AtFileParser.java Fri Mar 22 11:25:54 2013 +0100
@@ -20,13 +20,13 @@
import de.intevation.flys.importer.ImportTimeInterval;
+/** Parse *.at (Abflusstafeln?) files. */
public class AtFileParser {
public static final String ENCODING = "ISO-8859-1";
private static Logger logger = Logger.getLogger(AtFileParser.class);
-
// regular expression from hell to find out time range
public static final Pattern DATE_LINE = Pattern.compile(
"^\\*\\s*Abflu[^t]+tafel?\\s*([^\\d]+)" +
@@ -155,6 +155,7 @@
}
public static Date guessDate(String day, String month, String year) {
+ // TODO evaluate whether DateGuesser class can do that.
if (day == null && month == null && year == null) {
return null;
}
@@ -202,5 +203,20 @@
cal.setTimeInMillis(ms - ms%1000);
return cal.getTime();
}
+
+
+ /** Parse one or more files (useful for debugging). */
+ public static void main(String [] args) {
+
+ AtFileParser parser = new AtFileParser();
+
+ try {
+ for (String arg: args) {
+ parser.parse(new File(arg));
+ }
+ } catch (Exception e) {
+ logger.error("Exception caught", e);
+ }
+ }
}
// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/parsers/BedHeightParser.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/parsers/BedHeightParser.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/importer/parsers/BedHeightParser.java Fri Mar 22 11:25:54 2013 +0100
@@ -31,7 +31,7 @@
import de.intevation.flys.importer.ImportTimeInterval;
import de.intevation.flys.importer.ImportUnit;
import de.intevation.flys.model.BedHeightType;
-
+import de.intevation.flys.importer.ImporterSession;
public abstract class BedHeightParser {
@@ -203,14 +203,13 @@
if (m.matches()) {
String tmp = m.group(1);
-
- try {
- obj.setYear(Integer.valueOf(tmp));
- return true;
+ if (tmp.length() > 0) {
+ obj.setYear(Integer.parseInt(tmp));
}
- catch (NumberFormatException e) {
- log.warn("BHP: Error while parsing year!", e);
+ else {
+ log.warn("BHP: No year given.");
}
+ return true;
}
return false;
@@ -331,15 +330,15 @@
if (m.matches()) {
String tmp = m.group(1).replace(";", "");
- String name = BedHeightType.getBedHeightName(tmp);
+ BedHeightType bht = BedHeightType.fetchBedHeightTypeForType(
+ tmp, ImporterSession.getInstance().getDatabaseSession());
- if (name != null) {
- obj.setType(new ImportBedHeightType(name, tmp));
+ if (bht != null) {
+ obj.setType(new ImportBedHeightType(bht));
return true;
}
- else {
- log.warn("Unknown bed height type: '" + tmp + "'");
- }
+
+ log.warn("Unknown bed height type: '" + tmp + "'");
}
return false;
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/parsers/BundesWasserStrassenParser.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/parsers/BundesWasserStrassenParser.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/importer/parsers/BundesWasserStrassenParser.java Fri Mar 22 11:25:54 2013 +0100
@@ -1,22 +1,9 @@
package de.intevation.flys.importer.parsers;
-import java.math.BigDecimal;
-
-import java.text.NumberFormat;
-import java.text.ParseException;
-
-import java.util.ArrayList;
import java.util.HashMap;
-import java.util.List;
-import java.util.regex.Matcher;
-import java.util.regex.Pattern;
import org.apache.log4j.Logger;
-import de.intevation.flys.importer.ImportMorphWidth;
-import de.intevation.flys.importer.ImportMorphWidthValue;
-import de.intevation.flys.importer.ImportUnit;
-
/** Parse CSV file that contains official numbers for rivers. */
public class BundesWasserStrassenParser extends LineParser {
@@ -49,11 +36,15 @@
@Override
protected void handleLine(int lineNum, String line) {
String[] vals = line.split(",");
+ // Try both "," and ";" as separator.
if (vals.length != 2) {
- logger.warn("Invalid bwastr-id line:\n" + line);
- return;
+ vals = line.split(";");
+ if (vals.length != 2) {
+ logger.warn("Invalid bwastr-id line:\n" + line);
+ return;
+ }
}
- try{
+ try {
String name = unwrap(vals[0].toLowerCase());
String numberStr = unwrap(vals[1]);
Long number = Long.valueOf(numberStr);
@@ -65,7 +56,7 @@
}
- /** Get river->official number mapping. */
+ /** Get river -> official number mapping. */
public HashMap<String,Long> getMap() {
return numberMap;
}
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/parsers/SedimentYieldParser.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/parsers/SedimentYieldParser.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/importer/parsers/SedimentYieldParser.java Fri Mar 22 11:25:54 2013 +0100
@@ -33,28 +33,28 @@
public static final String FRAKTION_START = "Fraktion:";
public static final String FRACTION_COARSE_STR =
- "_Grobkorn.csv";
+ ".*Grobkorn.*";
public static final String FRACTION_FINE_MIDDLE_STR =
- "_Fein-Mittel-Kies.csv";
+ ".*Fein-Mittel-Kies.*";
public static final String FRACTION_SAND =
- "_Sand.csv";
+ ".*Sand.*";
public static final String FRACTION_SUSP_SAND =
- "_susp_Sand.csv";
+ ".*susp_Sand.*";
public static final String FRACTION_SUSP_SAND_BED =
- "_bettbild_Anteil_susp_Sand.csv";
+ ".*bettbild_Anteil_susp_Sand.*";
public static final String FRACTION_SUSP_SAND_BED_EPOCH =
- "_susp_Sand_bettbildAnteil.csv";
+ ".*susp_Sand_bettbildAnteil.*";
public static final String FRACTION_SUSPENDED_SEDIMENT =
- "_Schwebstoff.csv";
+ ".*Schwebstoff.*";
public static final String FRACTION_TOTAL =
- "_gesamt.csv";
+ ".*gesamt.*";
public static final Pattern TIMEINTERVAL_SINGLE =
@@ -357,35 +357,33 @@
}
}
- log.warn("SYP: Unknow grain fraction: '" + gfStr + "'");
+ log.warn("SYP: Unknown grain fraction: '" + gfStr + "'");
return null;
}
public static String getGrainFractionTypeName(String filename) {
- if (filename.endsWith(FRACTION_COARSE_STR)) {
- return GrainFraction.COARSE;
+ if (Pattern.matches(FRACTION_COARSE_STR, filename)) {
+ return GrainFraction.COARSE;
}
- else if (filename.endsWith(FRACTION_FINE_MIDDLE_STR)) {
+ else if (Pattern.matches(FRACTION_FINE_MIDDLE_STR, filename)) {
return GrainFraction.FINE_MIDDLE;
}
- else if (filename.endsWith(FRACTION_SAND) &&
- !filename.endsWith(FRACTION_SUSP_SAND)) {
+ else if (Pattern.matches(FRACTION_SUSP_SAND_BED, filename) ||
+ Pattern.matches(FRACTION_SUSP_SAND_BED_EPOCH, filename)) {
+ return GrainFraction.SUSP_SAND_BED;
+ }
+ else if (Pattern.matches(FRACTION_SUSP_SAND, filename)) {
+ return GrainFraction.SUSP_SAND;
+ }
+ else if (Pattern.matches(FRACTION_SAND, filename)) {
return GrainFraction.SAND;
}
- else if (filename.endsWith(FRACTION_SUSP_SAND) &&
- !filename.endsWith(FRACTION_SUSP_SAND_BED)) {
- return GrainFraction.SUSP_SAND;
- }
- else if (filename.endsWith(FRACTION_SUSP_SAND_BED) ||
- filename.endsWith(FRACTION_SUSP_SAND_BED_EPOCH)) {
- return GrainFraction.SUSP_SAND_BED;
- }
- else if (filename.endsWith(FRACTION_SUSPENDED_SEDIMENT)) {
+ else if (Pattern.matches(FRACTION_SUSPENDED_SEDIMENT, filename)) {
return GrainFraction.SUSPENDED_SEDIMENT;
}
- else if (filename.endsWith(FRACTION_TOTAL)) {
+ else if (Pattern.matches(FRACTION_TOTAL, filename)) {
return GrainFraction.TOTAL;
}
else {
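For context on the reordered checks above: once the `endsWith` tests become `Pattern.matches` against `.*…*` regexes, the more specific patterns must be tried first, because a file like `…susp_Sand.csv` also matches the general `.*Sand.*`. A minimal standalone sketch of that ordering, reusing the same regexes (the `classify` method and the filenames are illustrative):

```java
import java.util.regex.Pattern;

public class FractionOrder {
    // Same regexes as in SedimentYieldParser.
    static final String SUSP_SAND_BED       = ".*bettbild_Anteil_susp_Sand.*";
    static final String SUSP_SAND_BED_EPOCH = ".*susp_Sand_bettbildAnteil.*";
    static final String SUSP_SAND           = ".*susp_Sand.*";
    static final String SAND                = ".*Sand.*";

    // Most specific first; ".*Sand.*" would otherwise shadow the others.
    static String classify(String filename) {
        if (Pattern.matches(SUSP_SAND_BED, filename)
                || Pattern.matches(SUSP_SAND_BED_EPOCH, filename)) {
            return "susp_sand_bed";
        }
        if (Pattern.matches(SUSP_SAND, filename)) {
            return "susp_sand";
        }
        if (Pattern.matches(SAND, filename)) {
            return "sand";
        }
        return "unknown";
    }

    public static void main(String[] args) {
        System.out.println(classify("Einzeljahre_susp_Sand.csv")); // susp_sand
        System.out.println(classify("Einzeljahre_Sand.csv"));      // sand
    }
}
```

The old `endsWith` version needed explicit negations (`endsWith(SAND) && !endsWith(SUSP_SAND)`); putting the specific patterns first makes those negations unnecessary.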
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/parsers/StaFileParser.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/parsers/StaFileParser.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/importer/parsers/StaFileParser.java Fri Mar 22 11:25:54 2013 +0100
@@ -30,8 +30,8 @@
public static final String TYPES =
System.getProperty("flys.backend.main.value.types", "QWTD");
- public static final boolean PARSE_GAUGE_NUMBERS =
- Boolean.getBoolean("flys.backend.sta.parse.gauge.numbers");
+ public static final boolean NOT_PARSE_GAUGE_NUMBERS =
+ Boolean.getBoolean("flys.backend.sta.not.parse.gauge.numbers");
public static final Pattern QWTD_ =
Pattern.compile("\\s*([^\\s]+)\\s+([^\\s]+)\\s+([" +
@@ -68,7 +68,7 @@
Long gaugeNumber = null;
- if (PARSE_GAUGE_NUMBERS) {
+ if (!NOT_PARSE_GAUGE_NUMBERS) {
String gaugeNumberString = line.substring(0, 16).trim();
try {
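The renamed flag above inverts the default: `Boolean.getBoolean` reads a JVM system property and returns `false` when the property is unset, so with the new `flys.backend.sta.not.parse.gauge.numbers` name, gauge numbers are parsed unless someone explicitly opts out. A small sketch of that behaviour (the property name below is shortened for illustration):

```java
public class FlagDefault {

    // Mirrors the NOT_PARSE_GAUGE_NUMBERS logic: Boolean.getBoolean
    // returns false for an unset property, so parsing stays on by default.
    static boolean parseGaugeNumbers() {
        return !Boolean.getBoolean("demo.not.parse.gauge.numbers");
    }

    public static void main(String[] args) {
        System.out.println(parseGaugeNumbers()); // property unset: parsing enabled

        // Opting out now requires setting the property explicitly,
        // e.g. -Ddemo.not.parse.gauge.numbers=true on the command line.
        System.setProperty("demo.not.parse.gauge.numbers", "true");
        System.out.println(parseGaugeNumbers()); // parsing disabled
    }
}
```

With the old opt-in flag, forgetting `-Dflys.backend.sta.parse.gauge.numbers=true` silently skipped the gauge numbers; the double-negative name is awkward but makes parsing the safe default.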
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/parsers/WaterlevelDifferencesParser.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/parsers/WaterlevelDifferencesParser.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/importer/parsers/WaterlevelDifferencesParser.java Fri Mar 22 11:25:54 2013 +0100
@@ -2,6 +2,7 @@
import java.io.File;
import java.io.IOException;
+import java.math.BigDecimal;
import java.text.NumberFormat;
import java.text.ParseException;
import java.util.ArrayList;
@@ -12,11 +13,14 @@
import org.apache.log4j.Logger;
import de.intevation.flys.importer.ImportUnit;
-import de.intevation.flys.importer.ImportWaterlevelDifference;
-import de.intevation.flys.importer.ImportWaterlevelDifferenceColumn;
-import de.intevation.flys.importer.ImportWaterlevelDifferenceValue;
+import de.intevation.flys.importer.ImportWst;
+import de.intevation.flys.importer.ImportWstColumn;
+
+/**
+ * Parse WaterlevelDifferences CSV file.
+ */
public class WaterlevelDifferencesParser extends LineParser {
private static final Logger log =
@@ -28,32 +32,40 @@
public static final Pattern META_UNIT =
Pattern.compile("^Einheit: \\[(.*)\\].*");
+ /** List of parsed differences as ImportWst objects. */
+ private List<ImportWst> differences;
- private List<ImportWaterlevelDifference> differences;
+ private ImportWstColumn[] columns;
- private ImportWaterlevelDifferenceColumn[] columns;
-
- private ImportWaterlevelDifference current;
+ /** The currently processed dataset. */
+ private ImportWst current;
public WaterlevelDifferencesParser() {
- differences = new ArrayList<ImportWaterlevelDifference>();
+ differences = new ArrayList<ImportWst>();
}
- public List<ImportWaterlevelDifference> getDifferences() {
+ /** Get the differences parsed so far, as ImportWst objects. */
+ public List<ImportWst> getDifferences() {
return differences;
}
+ /**
+ * Parse a waterlevel-differences CSV file and create an ImportWst
+ * object from it.
+ */
@Override
public void parse(File file) throws IOException {
- current = new ImportWaterlevelDifference(file.getName());
+ current = new ImportWst(file.getName());
+ current.setKind(7);
super.parse(file);
}
+ /** No rewind implemented. */
@Override
protected void reset() {
}
@@ -62,8 +74,10 @@
@Override
protected void finish() {
if (columns != null && current != null) {
- for (ImportWaterlevelDifferenceColumn col: columns) {
- current.addValue(col);
+ // TODO figure out if it's needed, as the columns
+ // are registered at their construction time.
+ for (ImportWstColumn col: columns) {
+ // TODO place a current.addColumn(col); here?
}
differences.add(current);
@@ -73,6 +87,7 @@
columns = null;
}
+
@Override
protected void handleLine(int lineNum, String line) {
if (line.startsWith(START_META_CHAR)) {
@@ -130,13 +145,15 @@
private void initColumns(String[] cols) {
- columns = new ImportWaterlevelDifferenceColumn[cols.length];
+ current.setNumberColumns(cols.length);
+ columns = current.getColumns().toArray(new ImportWstColumn[cols.length]);
for (int i = 0; i < cols.length; i++) {
String name = cols[i].replace("\"", "");
log.debug("Create new column '" + name + "'");
- columns[i] = new ImportWaterlevelDifferenceColumn(name);
+ current.getColumn(i).setName(name);
+ current.getColumn(i).setDescription(name);
}
}
@@ -145,7 +162,7 @@
String[] cols = line.split(SEPERATOR_CHAR);
if (cols == null || cols.length < 2) {
- log.warn("skip invalid waterlevel line: '" + line + "'");
+ log.warn("skip invalid waterlevel-diff line: '" + line + "'");
return;
}
@@ -163,10 +180,9 @@
String value = cols[idx];
try {
- columns[i].addValue(new ImportWaterlevelDifferenceValue(
- station,
- nf.parse(value).doubleValue()
- ));
+ columns[i].addColumnValue(
+ new BigDecimal(station),
+ new BigDecimal(nf.parse(value).doubleValue()));
}
catch (ParseException pe) {
log.warn("Error while parsing value: '" + value + "'");
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/importer/parsers/WaterlevelParser.java
--- a/flys-backend/src/main/java/de/intevation/flys/importer/parsers/WaterlevelParser.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/importer/parsers/WaterlevelParser.java Fri Mar 22 11:25:54 2013 +0100
@@ -13,13 +13,11 @@
import org.apache.log4j.Logger;
import de.intevation.flys.importer.ImportUnit;
-import de.intevation.flys.importer.ImportWaterlevel;
-import de.intevation.flys.importer.ImportWaterlevelQRange;
-import de.intevation.flys.importer.ImportWaterlevelValue;
-import de.intevation.flys.importer.ImportWstColumn;
import de.intevation.flys.importer.ImportRange;
import de.intevation.flys.importer.ImportWst;
+import de.intevation.flys.importer.ImportWstColumn;
+import de.intevation.flys.importer.ImportWstColumnValue;
import de.intevation.flys.importer.ImportWstQRange;
@@ -43,71 +41,28 @@
public static final Pattern META_UNIT =
Pattern.compile("^Einheit: \\[(.*)\\].*");
- private List<ImportWaterlevel> waterlevels;
+ private List<ImportWst> waterlevels;
- private ImportWaterlevel current;
+ private ImportWst current;
- private ImportWaterlevelQRange currentQ;
+ /** A waterlevel Wst always has exactly one column. */
+ private ImportWstColumn column;
+
+ /** The current (incomplete) Q Range. */
+ private ImportWstQRange currentQRange;
+
+ /** The current (incomplete) km range for Q Range. */
+ private ImportRange currentRange;
private String currentDescription;
public WaterlevelParser() {
- waterlevels = new ArrayList<ImportWaterlevel>();
+ waterlevels = new ArrayList<ImportWst>();
}
- /**
- * Create ImportWst objects from ImportWaterlevel
- * objects.
- */
- public List<ImportWst> exportWsts() {
- List<ImportWst> wsts = new ArrayList<ImportWst>();
- for(ImportWaterlevel waterlevel: getWaterlevels()) {
- String description = waterlevel.getDescription();
- ImportWst wst = new ImportWst();
- wsts.add(wst);
- wst.setDescription(description);
- // Fixation kind.
- wst.setKind(2);
- wst.setUnit(waterlevel.getUnit());
-
- // Fake WST has but 1 column.
- wst.setNumberColumns(1);
- ImportWstColumn column = wst.getColumn(0);
- column.setDescription(description);
- column.setName(description);
- column.setPosition(0);
-
- // Build Q Range.
- List<ImportWaterlevelQRange> qRanges = waterlevel.getQRanges();
- for(ImportWaterlevelQRange range: qRanges) {
- List<ImportWaterlevelValue> values = range.getValues();
- if (values.size() < 2) {
- log.warn ("Not enough values to build valid QRange");
- continue;
- }
- ImportRange iRange = new ImportRange(
- BigDecimal.valueOf(values.get(0).getStation()),
- BigDecimal.valueOf(values.get(values.size() -1).getStation()));
- column.addColumnQRange(
- new ImportWstQRange(iRange, BigDecimal.valueOf(range.getQ())));
- }
-
- // The other W/KM values.
- for(ImportWaterlevelQRange range: qRanges) {
- for(ImportWaterlevelValue value: range.getValues()) {
- column.addColumnValue(BigDecimal.valueOf(value.getStation()),
- BigDecimal.valueOf(value.getW()));
- }
- }
- // TODO Maybe set a timeinterval.
- }
- return wsts;
- }
-
-
- public List<ImportWaterlevel> getWaterlevels() {
+ public List<ImportWst> getWaterlevels() {
return waterlevels;
}
@@ -122,16 +77,25 @@
@Override
protected void reset() {
- currentQ = null;
- current = new ImportWaterlevel(currentDescription);
+ currentQRange = null;
+ current = new ImportWst(currentDescription);
+ current.setNumberColumns(1);
+ column = current.getColumn(0);
+ column.setName(currentDescription);
+ column.setDescription(currentDescription);
+ current.setKind(6);
}
@Override
protected void finish() {
if (current != null) {
- if (currentQ != null) {
- current.addValue(currentQ);
+ if (currentQRange != null) {
+ List<ImportWstColumnValue> cValues = column.getColumnValues();
+ // Set end of range to last station.
+ currentRange.setB(cValues.get(cValues.size() -1).getPosition());
+ currentQRange.setRange(currentRange);
+ column.addColumnQRange(currentQRange);
}
waterlevels.add(current);
@@ -172,23 +136,21 @@
if (m.matches()) {
String unitStr = m.group(1);
String valueStr = m.group(2);
+ try {
+ if (currentQRange != null) {
+ // Finish off the last one.
+ List<ImportWstColumnValue> cValues = column.getColumnValues();
+ // Set end of range to last station.
+ currentRange.setB(cValues.get(cValues.size() -1).getPosition());
+ currentQRange.setRange(currentRange);
+ column.addColumnQRange(currentQRange);
+ }
+ currentQRange = new ImportWstQRange(null,
+ new BigDecimal(nf.parse(valueStr).doubleValue()));
+ currentRange = new ImportRange();
- if (currentQ != null) {
- if (current != null) {
- current.addValue(currentQ);
- }
- else {
- // this should never happen
- log.warn("Try to add Q range without waterlevel!");
- }
- }
-
- try {
log.debug("Found new Q range: Q=" + valueStr);
- currentQ = new ImportWaterlevelQRange(
- nf.parse(valueStr).doubleValue());
-
return true;
}
catch (ParseException pe) {
@@ -209,10 +171,17 @@
}
try {
+ // Store the value and remember the position for QRange, if needed.
Double station = nf.parse(cols[0]).doubleValue();
Double value = nf.parse(cols[1]).doubleValue();
- currentQ.addValue(new ImportWaterlevelValue(station, value));
+ BigDecimal stationBD = new BigDecimal(station);
+
+ column.addColumnValue(stationBD, new BigDecimal(value));
+
+ if (currentRange.getA() == null) {
+ currentRange.setA(stationBD);
+ }
}
catch (ParseException pe) {
log.warn("Error while parsing number values: '" + line + "'");
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/AxisKind.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-backend/src/main/java/de/intevation/flys/model/AxisKind.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,44 @@
+package de.intevation.flys.model;
+
+import java.io.Serializable;
+
+import javax.persistence.Column;
+import javax.persistence.Entity;
+import javax.persistence.Id;
+import javax.persistence.Table;
+
+@Entity
+@Table(name = "axis_kinds")
+public class AxisKind implements Serializable {
+ private Integer id;
+ private String name;
+
+ @Id
+ @Column(name = "id")
+ public Integer getId() {
+ return id;
+ }
+
+ public void setId(Integer id) {
+ this.id = id;
+ }
+
+ /**
+ * Get name.
+ *
+ * @return The display name of the kind as String.
+ */
+ @Column(name = "name")
+ public String getName() {
+ return name;
+ }
+
+ /**
+ * Set name.
+ *
+ * @param name the value to set.
+ */
+ public void setName(String name) {
+ this.name = name;
+ }
+}
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/BedHeightType.java
--- a/flys-backend/src/main/java/de/intevation/flys/model/BedHeightType.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/model/BedHeightType.java Fri Mar 22 11:25:54 2013 +0100
@@ -2,6 +2,8 @@
import java.io.Serializable;
+import java.util.List;
+
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;
@@ -12,6 +14,10 @@
import org.apache.log4j.Logger;
+import org.hibernate.Query;
+import org.hibernate.Session;
+
+import de.intevation.flys.backend.SessionHolder;
@Entity
@Table(name = "bed_height_type")
@@ -22,15 +28,13 @@
private Integer id;
private String name;
- private String description;
public BedHeightType() {
}
- public BedHeightType(String name, String description) {
- this.name = name;
- this.description = description;
+ public BedHeightType(String name) {
+ this.name = name;
}
@Id
@@ -59,36 +63,24 @@
this.name = name;
}
- @Column(name = "description")
- public String getDescription() {
- return description;
+ public static BedHeightType fetchBedHeightTypeForType(String type) {
+ return fetchBedHeightTypeForType(type, null);
}
- public void setDescription(String description) {
- this.description = description;
- }
+ public static BedHeightType fetchBedHeightTypeForType(String name, Session session) {
+ if (session == null) {
+ session = SessionHolder.HOLDER.get();
+ }
- public static String getBedHeightName(String description) {
- if (description.equals("Flächenpeilung")) {
- return "FP";
- }
- else if ("Querprofile".equals(description)) {
- return "QP";
- }
- else if ("Querprofil".equals(description)) {
- return "QP";
- }
- else if ("TIN".equals(description)) {
- return "TIN";
- }
- else if ("Flächen- u. Querprofilpeilungen".equals(description)) {
- return "FP-QP";
- }
- else {
- log.warn("Unknown bed height type: " + description);
- return null;
- }
+ Query query = session.createQuery(
+ "from BedHeightType where name=:name");
+
+ query.setParameter("name", name);
+
+ List<Object> results = query.list();
+
+ return results.isEmpty() ? null : (BedHeightType)results.get(0);
}
}
// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/BoundaryKind.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-backend/src/main/java/de/intevation/flys/model/BoundaryKind.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,45 @@
+package de.intevation.flys.model;
+
+import java.io.Serializable;
+
+import javax.persistence.Column;
+import javax.persistence.Entity;
+import javax.persistence.Id;
+import javax.persistence.Table;
+
+@Entity
+@Table(name = "boundary_kinds")
+public class BoundaryKind implements Serializable {
+
+ private Integer id;
+ private String name;
+
+ @Id
+ @Column(name = "id")
+ public Integer getId() {
+ return id;
+ }
+
+ public void setId(Integer id) {
+ this.id = id;
+ }
+
+ /**
+ * Get name.
+ *
+ * @return name of the kind of boundary as String.
+ */
+ @Column(name = "name")
+ public String getName() {
+ return name;
+ }
+
+ /**
+ * Set name.
+ *
+ * @param name the value to set.
+ */
+ public void setName(String name) {
+ this.name = name;
+ }
+}
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/Catchment.java
--- a/flys-backend/src/main/java/de/intevation/flys/model/Catchment.java Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,107 +0,0 @@
-package de.intevation.flys.model;
-
-import java.io.Serializable;
-import java.math.BigDecimal;
-import java.util.List;
-
-import javax.persistence.Column;
-import javax.persistence.Entity;
-import javax.persistence.Id;
-import javax.persistence.JoinColumn;
-import javax.persistence.OneToOne;
-import javax.persistence.Table;
-
-import org.hibernate.Session;
-import org.hibernate.Query;
-import org.hibernate.annotations.Type;
-
-import com.vividsolutions.jts.geom.Geometry;
-
-import de.intevation.flys.backend.SessionHolder;
-
-
-@Entity
-@Table(name = "catchment")
-public class Catchment
-implements Serializable
-{
- private Integer id;
- private BigDecimal area;
- private String name;
- private River river;
- private Geometry geom;
-
- public Catchment() {
- }
-
-
- @Id
- @Column(name = "id")
- public Integer getId() {
- return id;
- }
-
-
- public void setId(Integer id) {
- this.id = id;
- }
-
-
- @OneToOne
- @JoinColumn(name = "river_id")
- public River getRiver() {
- return river;
- }
-
-
- public void setRiver(River river) {
- this.river = river;
- }
-
-
- @Column(name = "name")
- public String getName() {
- return name;
- }
-
-
- public void setName(String name) {
- this.name = name;
- }
-
-
- @Column(name = "area")
- public BigDecimal getArea() {
- return area;
- }
-
-
- public void setArea(BigDecimal area) {
- this.area = area;
- }
-
-
- @Column(name = "geom")
- @Type(type = "org.hibernatespatial.GeometryUserType")
- public Geometry getGeom() {
- return geom;
- }
-
-
- public void setGeom(Geometry geom) {
- this.geom = geom;
- }
-
-
- public static List<Catchment> getCatchments(int riverId, String name) {
- Session session = SessionHolder.HOLDER.get();
-
- Query query = session.createQuery(
- "from Catchment where river.id =:river_id AND name=:name");
- query.setParameter("river_id", riverId);
- query.setParameter("name", name);
-
- return query.list();
- }
-}
-// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/CrossSectionTrack.java
--- a/flys-backend/src/main/java/de/intevation/flys/model/CrossSectionTrack.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/model/CrossSectionTrack.java Fri Mar 22 11:25:54 2013 +0100
@@ -152,6 +152,7 @@
Query query = session.createQuery(
"from CrossSectionTrack where river.name =:river " +
+ "and kind_id = 1 " +
"order by abs( km - :mykm)");
query.setParameter("river", river);
query.setParameter("mykm", new BigDecimal(km));
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/CrossSectionTrackKind.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-backend/src/main/java/de/intevation/flys/model/CrossSectionTrackKind.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,45 @@
+package de.intevation.flys.model;
+
+import java.io.Serializable;
+
+import javax.persistence.Column;
+import javax.persistence.Entity;
+import javax.persistence.Id;
+import javax.persistence.Table;
+
+@Entity
+@Table(name = "cross_section_track_kinds")
+public class CrossSectionTrackKind implements Serializable {
+ private Integer id;
+ private String name;
+
+ @Id
+ @Column(name = "id")
+ public Integer getId() {
+ return id;
+ }
+
+ public void setId(Integer id) {
+ this.id = id;
+ }
+
+ /**
+ * Get name.
+ *
+ * @return The name of the Cross section kind as String.
+ */
+ @Column(name = "name")
+ public String getName() {
+ return name;
+ }
+
+ /**
+ * Set name.
+ *
+ * @param name the value to set.
+ */
+ public void setName(String name) {
+ this.name = name;
+ }
+}
+
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/DGM.java
--- a/flys-backend/src/main/java/de/intevation/flys/model/DGM.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/model/DGM.java Fri Mar 22 11:25:54 2013 +0100
@@ -6,10 +6,13 @@
import javax.persistence.Column;
import javax.persistence.Entity;
+import javax.persistence.GeneratedValue;
+import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.JoinColumn;
import javax.persistence.OneToOne;
import javax.persistence.Table;
+import javax.persistence.SequenceGenerator;
import org.hibernate.Session;
import org.hibernate.Query;
@@ -21,14 +24,15 @@
@Table(name = "dem")
public class DGM implements Serializable {
- private Integer id;
+ private Integer id;
+ private Integer srid;
- private River river;
+ private River river;
- private BigDecimal lower;
- private BigDecimal upper;
+ private Range range;
+ private TimeInterval time_interval;
- private String path;
+ private String path;
public DGM() {
@@ -40,6 +44,13 @@
}
@Id
+ @SequenceGenerator(
+ name = "SEQUENCE_DEM_ID_SEQ",
+ sequenceName = "DEM_ID_SEQ",
+ allocationSize = 1)
+ @GeneratedValue(
+ strategy = GenerationType.SEQUENCE,
+ generator = "SEQUENCE_DEM_ID_SEQ")
@Column(name = "id")
public Integer getId() {
return id;
@@ -55,24 +66,6 @@
return river;
}
- public void setLower(BigDecimal lower) {
- this.lower = lower;
- }
-
- @Column(name = "lower")
- public BigDecimal getLower() {
- return lower;
- }
-
- public void setUpper(BigDecimal upper) {
- this.upper = upper;
- }
-
- @Column(name = "upper")
- public BigDecimal getUpper() {
- return upper;
- }
-
public void setPath(String path) {
this.path = path;
}
@@ -82,6 +75,14 @@
return path;
}
+ public void setSrid(int srid) {
+ this.srid = srid;
+ }
+
+ @Column(name = "srid")
+ public int getSrid() {
+ return srid;
+ }
public static DGM getDGM(int id) {
Session session = SessionHolder.HOLDER.get();
@@ -101,8 +102,8 @@
Query query = session.createQuery(
"from DGM where river.name =:river and " +
- "lower <=:lower and upper >=:lower and " +
- "lower <=:upper and upper >=:upper");
+ "range.a <=:lower and range.b >=:lower and " +
+ "range.a <=:upper and range.b >=:upper");
query.setParameter("river", river);
query.setParameter("lower", new BigDecimal(lower));
query.setParameter("upper", new BigDecimal(upper));
@@ -111,5 +112,27 @@
return result.isEmpty() ? null : result.get(0);
}
+
+ @OneToOne
+ @JoinColumn(name = "range_id")
+ public Range getRange() {
+ return range;
+ }
+
+ public void setRange(Range range) {
+ this.range = range;
+ }
+
+ @OneToOne
+ @JoinColumn(name = "time_interval_id")
+ public TimeInterval getTimeInterval() {
+ return time_interval;
+ }
+
+ public void setTimeInterval(TimeInterval time_interval) {
+ this.time_interval = time_interval;
+ }
+
+
}
// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/FedState.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-backend/src/main/java/de/intevation/flys/model/FedState.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,45 @@
+package de.intevation.flys.model;
+
+import java.io.Serializable;
+
+import javax.persistence.Column;
+import javax.persistence.Entity;
+import javax.persistence.Id;
+import javax.persistence.Table;
+
+@Entity
+@Table(name = "fed_states")
+public class FedState implements Serializable {
+
+ private Integer id;
+ private String name;
+
+ @Id
+ @Column(name = "id")
+ public Integer getId() {
+ return id;
+ }
+
+ public void setId(Integer id) {
+ this.id = id;
+ }
+
+ /**
+ * Get name.
+ *
+ * @return name of the Federal State as String.
+ */
+ @Column(name = "name")
+ public String getName() {
+ return name;
+ }
+
+ /**
+ * Set name.
+ *
+ * @param name the value to set.
+ */
+ public void setName(String name) {
+ this.name = name;
+ }
+}
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/Floodplain.java
--- a/flys-backend/src/main/java/de/intevation/flys/model/Floodplain.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/model/Floodplain.java Fri Mar 22 11:25:54 2013 +0100
@@ -24,11 +24,13 @@
public class Floodplain
implements Serializable
{
- private Integer id;
+ private Integer id;
- private River river;
+ private FloodplainKind kind;
- private Polygon geom;
+ private River river;
+
+ private Polygon geom;
public Floodplain() {
@@ -55,6 +57,16 @@
this.river = river;
}
+ @OneToOne
+ @JoinColumn(name = "kind_id")
+ public FloodplainKind getKind() {
+ return kind;
+ }
+
+ public void setKind(FloodplainKind kind) {
+ this.kind = kind;
+ }
+
@Column(name = "geom")
@Type(type = "org.hibernatespatial.GeometryUserType")
public Polygon getGeom() {
@@ -69,8 +81,10 @@
public static Floodplain getFloodplain(String river) {
Session session = SessionHolder.HOLDER.get();
+ // kind_id 0 -> Official
+ // kind_id 1 -> Misc.
Query query = session.createQuery(
- "from Floodplain where river.name =:river");
+ "from Floodplain where river.name =:river and kind_id=1");
query.setParameter("river", river);
List<Floodplain> result = query.list();
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/FloodplainKind.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-backend/src/main/java/de/intevation/flys/model/FloodplainKind.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,45 @@
+package de.intevation.flys.model;
+
+import java.io.Serializable;
+
+import javax.persistence.Column;
+import javax.persistence.Entity;
+import javax.persistence.Id;
+import javax.persistence.Table;
+
+@Entity
+@Table(name = "floodplain_kinds")
+public class FloodplainKind implements Serializable {
+ private Integer id;
+ private String name;
+
+ @Id
+ @Column(name = "id")
+ public Integer getId() {
+ return id;
+ }
+
+ public void setId(Integer id) {
+ this.id = id;
+ }
+
+ /**
+ * Get name.
+ *
+ * @return The name of the Floodplain Kind as String.
+ */
+ @Column(name = "name")
+ public String getName() {
+ return name;
+ }
+
+ /**
+ * Set name.
+ *
+ * @param name the value to set.
+ */
+ public void setName(String name) {
+ this.name = name;
+ }
+}
+
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/FlowVelocityModel.java
--- a/flys-backend/src/main/java/de/intevation/flys/model/FlowVelocityModel.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/model/FlowVelocityModel.java Fri Mar 22 11:25:54 2013 +0100
@@ -1,24 +1,25 @@
package de.intevation.flys.model;
+import de.intevation.flys.backend.SessionHolder;
+
import java.io.Serializable;
+
import java.util.List;
+import javax.persistence.Column;
import javax.persistence.Entity;
+import javax.persistence.GeneratedValue;
+import javax.persistence.GenerationType;
import javax.persistence.Id;
-import javax.persistence.Table;
-import javax.persistence.GeneratedValue;
-import javax.persistence.Column;
-import javax.persistence.SequenceGenerator;
-import javax.persistence.GenerationType;
import javax.persistence.JoinColumn;
import javax.persistence.OneToOne;
-
-import org.hibernate.Session;
-import org.hibernate.Query;
+import javax.persistence.SequenceGenerator;
+import javax.persistence.Table;
import org.apache.log4j.Logger;
-import de.intevation.flys.backend.SessionHolder;
+import org.hibernate.Query;
+import org.hibernate.Session;
@Entity
@@ -31,12 +32,8 @@
private Integer id;
- private River river;
-
private DischargeZone dischargeZone;
- private List<FlowVelocityModelValue> values;
-
private String description;
@@ -44,17 +41,15 @@
}
- public FlowVelocityModel(River river, DischargeZone dischargeZone) {
- this(river, dischargeZone, null);
+ public FlowVelocityModel(DischargeZone dischargeZone) {
+ this(dischargeZone, null);
}
public FlowVelocityModel(
- River river,
DischargeZone dischargeZone,
String description
) {
- this.river = river;
this.dischargeZone = dischargeZone;
this.description = description;
}
@@ -77,16 +72,6 @@
}
@OneToOne
- @JoinColumn(name = "river_id")
- public River getRiver() {
- return river;
- }
-
- public void setRiver(River river) {
- this.river = river;
- }
-
- @OneToOne
@JoinColumn(name = "discharge_zone_id")
public DischargeZone getDischargeZone() {
return dischargeZone;
@@ -106,16 +91,13 @@
}
- public static List<FlowVelocityModel> getModels(
- River river,
- DischargeZone zone
- ) {
+ public static List<FlowVelocityModel> getModels(DischargeZone zone) {
+
Session session = SessionHolder.HOLDER.get();
Query query = session.createQuery(
- "from FlowVelocityModel where river=:river and dischargeZone=:zone");
+ "from FlowVelocityModel where dischargeZone=:zone");
- query.setParameter("river", river);
query.setParameter("zone", zone);
return query.list();
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/HWSKind.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-backend/src/main/java/de/intevation/flys/model/HWSKind.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,44 @@
+package de.intevation.flys.model;
+
+import java.io.Serializable;
+
+import javax.persistence.Column;
+import javax.persistence.Entity;
+import javax.persistence.Id;
+import javax.persistence.Table;
+
+@Entity
+@Table(name = "hws_kinds")
+public class HWSKind implements Serializable {
+ private Integer id;
+ private String name;
+
+ @Id
+ @Column(name = "id")
+ public Integer getId() {
+ return id;
+ }
+
+ public void setId(Integer id) {
+ this.id = id;
+ }
+
+ /**
+ * Get name.
+ *
+ * @return The name of the kind of flood protection structure (Hochwasserschutzanlage) as String.
+ */
+ @Column(name = "kind")
+ public String getName() {
+ return name;
+ }
+
+ /**
+ * Set name.
+ *
+ * @param name the value to set.
+ */
+ public void setName(String name) {
+ this.name = name;
+ }
+}
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/HWSLine.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-backend/src/main/java/de/intevation/flys/model/HWSLine.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,336 @@
+package de.intevation.flys.model;
+
+import com.vividsolutions.jts.geom.Geometry;
+
+import de.intevation.flys.model.HWSKind;
+
+import java.io.Serializable;
+import java.util.List;
+
+import javax.persistence.Column;
+import javax.persistence.Entity;
+import javax.persistence.GeneratedValue;
+import javax.persistence.GenerationType;
+import javax.persistence.Id;
+import javax.persistence.JoinColumn;
+import javax.persistence.OneToOne;
+import javax.persistence.Table;
+import javax.persistence.SequenceGenerator;
+
+import org.hibernate.Session;
+import org.hibernate.Query;
+import org.hibernate.annotations.Type;
+
+import de.intevation.flys.backend.SessionHolder;
+
+@Entity
+@Table(name = "hws_lines")
+public class HWSLine implements Serializable {
+
+ private Integer id;
+
+ private Integer ogrFid;
+ private HWSKind kind;
+ private FedState fedState;
+ private River river;
+ private Integer official;
+ private Integer shoreSide;
+ private String name;
+ private String path;
+ private String agency;
+ private String range;
+ private String source;
+ private String status_date;
+ private String description;
+ private Geometry geom;
+
+ @Id
+ @SequenceGenerator(
+ name = "SEQUENCE_HWS_LINES_ID_SEQ",
+ sequenceName = "HWS_LINES_ID_SEQ",
+ allocationSize = 1)
+ @GeneratedValue(
+ strategy = GenerationType.SEQUENCE,
+ generator = "SEQUENCE_HWS_LINES_ID_SEQ")
+ @Column(name = "id")
+ public Integer getId() {
+ return id;
+ }
+
+ public void setId(Integer id) {
+ this.id = id;
+ }
+
+ @Column(name = "geom")
+ @Type(type = "org.hibernatespatial.GeometryUserType")
+ public Geometry getGeom() {
+ return geom;
+ }
+
+
+ public void setGeom(Geometry geom) {
+ this.geom = geom;
+ }
+
+ /**
+ * Get ogrFid.
+ *
+ * @return ogrFid as Integer.
+ */
+ @Column(name = "ogr_fid")
+ public Integer getOgrFid() {
+ return ogrFid;
+ }
+
+ /**
+ * Set ogrFid.
+ *
+ * @param ogrFid the value to set.
+ */
+ public void setOgrFid(Integer ogrFid) {
+ this.ogrFid = ogrFid;
+ }
+
+
+ /**
+ * Get official.
+ *
+ * @return official as Integer.
+ */
+ @Column(name = "official")
+ public Integer getofficial() {
+ return official;
+ }
+
+ /**
+ * Set official.
+ *
+ * @param official the value to set.
+ */
+ public void setofficial(Integer official) {
+ this.official = official;
+ }
+
+ /**
+ * Get shoreSide.
+ *
+ * @return shoreSide as Integer.
+ */
+ @Column(name = "shore_side")
+ public Integer getShoreSide() {
+ return shoreSide;
+ }
+
+ /**
+ * Set shoreSide.
+ *
+ * @param shoreSide the value to set.
+ */
+ public void setShoreSide(Integer shoreSide) {
+ this.shoreSide = shoreSide;
+ }
+
+ /**
+ * Get name.
+ *
+ * @return name as String.
+ */
+ @Column(name = "name")
+ public String getName() {
+ return name;
+ }
+
+ /**
+ * Set name.
+ *
+ * @param name the value to set.
+ */
+ public void setName(String name) {
+ this.name = name;
+ }
+
+ /**
+ * Get path.
+ *
+ * @return path as String.
+ */
+ @Column(name = "path")
+ public String getPath() {
+ return path;
+ }
+
+ /**
+ * Set path.
+ *
+ * @param path the value to set.
+ */
+ public void setPath(String path) {
+ this.path = path;
+ }
+
+ /**
+ * Get agency.
+ *
+ * @return agency as String.
+ */
+ @Column(name = "agency")
+ public String getAgency() {
+ return agency;
+ }
+
+ /**
+ * Set agency.
+ *
+ * @param agency the value to set.
+ */
+ public void setAgency(String agency) {
+ this.agency = agency;
+ }
+
+ /**
+ * Get range.
+ *
+ * @return range as String.
+ */
+ @Column(name = "range")
+ public String getRange() {
+ return range;
+ }
+
+ /**
+ * Set range.
+ *
+ * @param range the value to set.
+ */
+ public void setRange(String range) {
+ this.range = range;
+ }
+
+ /**
+ * Get source.
+ *
+ * @return source as String.
+ */
+ @Column(name = "source")
+ public String getSource() {
+ return source;
+ }
+
+ /**
+ * Set source.
+ *
+ * @param source the value to set.
+ */
+ public void setSource(String source) {
+ this.source = source;
+ }
+
+ /**
+ * Get status_date.
+ *
+ * @return status_date as String.
+ */
+ @Column(name = "status_date")
+ public String getStatusDate() {
+ return status_date;
+ }
+
+ /**
+ * Set status_date.
+ *
+ * @param status_date the value to set.
+ */
+ public void setStatusDate(String status_date) {
+ this.status_date = status_date;
+ }
+
+ /**
+ * Get description.
+ *
+ * @return description as String.
+ */
+ @Column(name = "description")
+ public String getDescription() {
+ return description;
+ }
+
+ /**
+ * Set description.
+ *
+ * @param description the value to set.
+ */
+ public void setDescription(String description) {
+ this.description = description;
+ }
+
+ /**
+ * Get kind.
+ *
+ * @return kind as HWSKind.
+ */
+ @OneToOne
+ @JoinColumn(name = "kind_id")
+ public HWSKind getKind() {
+ return kind;
+ }
+
+ /**
+ * Set kind.
+ *
+ * @param kind the value to set.
+ */
+ public void setKind(HWSKind kind) {
+ this.kind = kind;
+ }
+
+ /**
+ * Get fedState.
+ *
+ * @return fedState as FedState.
+ */
+ @OneToOne
+ @JoinColumn(name = "fed_state_id")
+ public FedState getFedState() {
+ return fedState;
+ }
+
+ /**
+ * Set fedState.
+ *
+ * @param fedState the value to set.
+ */
+ public void setFedState(FedState fedState) {
+ this.fedState = fedState;
+ }
+
+ /**
+ * Get river.
+ *
+ * @return river as River.
+ */
+ @OneToOne
+ @JoinColumn(name = "river_id")
+ public River getRiver() {
+ return river;
+ }
+
+ /**
+ * Set river.
+ *
+ * @param river the value to set.
+ */
+ public void setRiver(River river) {
+ this.river = river;
+ }
+
+ public static List<HWSLine> getLines(int riverId, String name) {
+ Session session = SessionHolder.HOLDER.get();
+
+ Query query = session.createQuery(
+ "from HWSLine where river.id =:river_id and name=:name");
+ query.setParameter("river_id", riverId);
+ query.setParameter("name", name);
+
+ return query.list();
+ }
+}
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/HWSPoint.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-backend/src/main/java/de/intevation/flys/model/HWSPoint.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,457 @@
+package de.intevation.flys.model;
+
+import com.vividsolutions.jts.geom.Geometry;
+
+import java.io.Serializable;
+import java.util.List;
+
+import java.math.BigDecimal;
+
+import javax.persistence.Column;
+import javax.persistence.Entity;
+import javax.persistence.GeneratedValue;
+import javax.persistence.GenerationType;
+import javax.persistence.Id;
+import javax.persistence.JoinColumn;
+import javax.persistence.OneToOne;
+import javax.persistence.Table;
+import javax.persistence.SequenceGenerator;
+
+import org.hibernate.annotations.Type;
+import org.hibernate.Session;
+import org.hibernate.Query;
+
+import de.intevation.flys.backend.SessionHolder;
+
+@Entity
+@Table(name = "hws_points")
+public class HWSPoint implements Serializable {
+
+ private Integer id;
+
+ private Integer ogrFid;
+ private HWSKind kind;
+ private FedState fedState;
+ private River river;
+ private Integer official;
+ private Integer shoreSide;
+ private String name;
+ private String path;
+ private String agency;
+ private String range;
+ private String source;
+ private String statusDate;
+ private String description;
+ private BigDecimal freeboard;
+ private BigDecimal dikeKm;
+ private BigDecimal z;
+ private BigDecimal zTarget;
+ private BigDecimal ratedLevel;
+ private Geometry geom;
+
+ @Id
+ @SequenceGenerator(
+ name = "SEQUENCE_HWS_POINTS_ID_SEQ",
+ sequenceName = "HWS_POINTS_ID_SEQ",
+ allocationSize = 1)
+ @GeneratedValue(
+ strategy = GenerationType.SEQUENCE,
+ generator = "SEQUENCE_HWS_POINTS_ID_SEQ")
+ @Column(name = "id")
+ public Integer getId() {
+ return id;
+ }
+
+ public void setId(Integer id) {
+ this.id = id;
+ }
+
+
+ @Column(name = "geom")
+ @Type(type = "org.hibernatespatial.GeometryUserType")
+ public Geometry getGeom() {
+ return geom;
+ }
+
+
+ public void setGeom(Geometry geom) {
+ this.geom = geom;
+ }
+
+ /**
+ * Get ogrFid.
+ *
+ * @return ogrFid as Integer.
+ */
+ @Column(name = "ogr_fid")
+ public Integer getOgrFid() {
+ return ogrFid;
+ }
+
+ /**
+ * Set ogrFid.
+ *
+ * @param ogrFid the value to set.
+ */
+ public void setOgrFid(Integer ogrFid) {
+ this.ogrFid = ogrFid;
+ }
+
+
+ /**
+ * Get official.
+ *
+ * @return official as Integer.
+ */
+ @Column(name = "official")
+ public Integer getofficial() {
+ return official;
+ }
+
+ /**
+ * Set official.
+ *
+ * @param official the value to set.
+ */
+ public void setofficial(Integer official) {
+ this.official = official;
+ }
+
+ /**
+ * Get shoreSide.
+ *
+ * @return shoreSide as Integer.
+ */
+ @Column(name = "shore_side")
+ public Integer getShoreSide() {
+ return shoreSide;
+ }
+
+ /**
+ * Set shoreSide.
+ *
+ * @param shoreSide the value to set.
+ */
+ public void setShoreSide(Integer shoreSide) {
+ this.shoreSide = shoreSide;
+ }
+
+ /**
+ * Get name.
+ *
+ * @return name as String.
+ */
+ @Column(name = "name")
+ public String getName() {
+ return name;
+ }
+
+ /**
+ * Set name.
+ *
+ * @param name the value to set.
+ */
+ public void setName(String name) {
+ this.name = name;
+ }
+
+ /**
+ * Get path.
+ *
+ * @return path as String.
+ */
+ @Column(name = "path")
+ public String getPath() {
+ return path;
+ }
+
+ /**
+ * Set path.
+ *
+ * @param path the value to set.
+ */
+ public void setPath(String path) {
+ this.path = path;
+ }
+
+ /**
+ * Get agency.
+ *
+ * @return agency as String.
+ */
+ @Column(name = "agency")
+ public String getAgency() {
+ return agency;
+ }
+
+ /**
+ * Set agency.
+ *
+ * @param agency the value to set.
+ */
+ public void setAgency(String agency) {
+ this.agency = agency;
+ }
+
+ /**
+ * Get range.
+ *
+ * @return range as String.
+ */
+ @Column(name = "range")
+ public String getRange() {
+ return range;
+ }
+
+ /**
+ * Set range.
+ *
+ * @param range the value to set.
+ */
+ public void setRange(String range) {
+ this.range = range;
+ }
+
+ /**
+ * Get source.
+ *
+ * @return source as String.
+ */
+ @Column(name = "source")
+ public String getSource() {
+ return source;
+ }
+
+ /**
+ * Set source.
+ *
+ * @param source the value to set.
+ */
+ public void setSource(String source) {
+ this.source = source;
+ }
+
+ /**
+ * Get statusDate.
+ *
+ * @return statusDate as String.
+ */
+ @Column(name = "status_date")
+ public String getStatusDate() {
+ return statusDate;
+ }
+
+ /**
+ * Set statusDate.
+ *
+ * @param statusDate the value to set.
+ */
+ public void setStatusDate(String statusDate)
+ {
+ this.statusDate = statusDate;
+ }
+
+ /**
+ * Get description.
+ *
+ * @return description as String.
+ */
+ @Column(name = "description")
+ public String getDescription()
+ {
+ return description;
+ }
+
+ /**
+ * Set description.
+ *
+ * @param description the value to set.
+ */
+ public void setDescription(String description)
+ {
+ this.description = description;
+ }
+
+ /**
+ * Get freeboard.
+ *
+ * @return freeboard as BigDecimal.
+ */
+ @Column(name = "freeboard")
+ public BigDecimal getFreeboard()
+ {
+ return freeboard;
+ }
+
+ /**
+ * Set freeboard.
+ *
+ * @param freeboard the value to set.
+ */
+ public void setFreeboard(BigDecimal freeboard)
+ {
+ this.freeboard = freeboard;
+ }
+
+ /**
+ * Get dikeKm.
+ *
+ * @return dikeKm as BigDecimal.
+ */
+ @Column(name = "dike_km")
+ public BigDecimal getDikeKm()
+ {
+ return dikeKm;
+ }
+
+ /**
+ * Set dikeKm.
+ *
+ * @param dikeKm the value to set.
+ */
+ public void setDikeKm(BigDecimal dikeKm)
+ {
+ this.dikeKm = dikeKm;
+ }
+
+ /**
+ * Get z.
+ *
+ * @return z as BigDecimal.
+ */
+ @Column(name = "z")
+ public BigDecimal getZ()
+ {
+ return z;
+ }
+
+ /**
+ * Set z.
+ *
+ * @param z the value to set.
+ */
+ public void setZ(BigDecimal z)
+ {
+ this.z = z;
+ }
+
+ /**
+ * Get zTarget.
+ *
+ * @return zTarget as BigDecimal.
+ */
+ @Column(name = "z_target")
+ public BigDecimal getZTarget()
+ {
+ return zTarget;
+ }
+
+ /**
+ * Set zTarget.
+ *
+ * @param zTarget the value to set.
+ */
+ public void setZTarget(BigDecimal zTarget)
+ {
+ this.zTarget = zTarget;
+ }
+
+ /**
+ * Get ratedLevel.
+ *
+ * @return ratedLevel as BigDecimal.
+ */
+ @Column(name = "rated_level")
+ public BigDecimal getRatedLevel()
+ {
+ return ratedLevel;
+ }
+
+ /**
+ * Set ratedLevel.
+ *
+ * @param ratedLevel the value to set.
+ */
+ public void setRatedLevel(BigDecimal ratedLevel)
+ {
+ this.ratedLevel = ratedLevel;
+ }
+
+ /**
+ * Get kind.
+ *
+ * @return kind as HWSKind.
+ */
+ @OneToOne
+ @JoinColumn(name = "kind_id")
+ public HWSKind getKind()
+ {
+ return kind;
+ }
+
+ /**
+ * Set kind.
+ *
+ * @param kind the value to set.
+ */
+ public void setKind(HWSKind kind)
+ {
+ this.kind = kind;
+ }
+
+ /**
+ * Get fedState.
+ *
+ * @return fedState as FedState.
+ */
+ @OneToOne
+ @JoinColumn(name = "fed_state_id")
+ public FedState getFedState()
+ {
+ return fedState;
+ }
+
+ /**
+ * Set fedState.
+ *
+ * @param fedState the value to set.
+ */
+ public void setFedState(FedState fedState)
+ {
+ this.fedState = fedState;
+ }
+
+ /**
+ * Get river.
+ *
+ * @return river as River.
+ */
+ @OneToOne
+ @JoinColumn(name = "river_id")
+ public River getRiver()
+ {
+ return river;
+ }
+
+ /**
+ * Set river.
+ *
+ * @param river the value to set.
+ */
+ public void setRiver(River river)
+ {
+ this.river = river;
+ }
+
+ public static List<HWSPoint> getPoints(int riverId, String name) {
+ Session session = SessionHolder.HOLDER.get();
+
+ Query query = session.createQuery(
+ "from HWSPoint where river.id =:river_id and name=:name");
+ query.setParameter("river_id", riverId);
+ query.setParameter("name", name);
+
+ return query.list();
+ }
+}
+
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/Hws.java
--- a/flys-backend/src/main/java/de/intevation/flys/model/Hws.java Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,106 +0,0 @@
-package de.intevation.flys.model;
-
-import java.io.Serializable;
-import java.util.List;
-
-import javax.persistence.Column;
-import javax.persistence.Entity;
-import javax.persistence.Id;
-import javax.persistence.JoinColumn;
-import javax.persistence.OneToOne;
-import javax.persistence.Table;
-
-import org.hibernate.Session;
-import org.hibernate.Query;
-import org.hibernate.annotations.Type;
-
-import com.vividsolutions.jts.geom.LineString;
-
-import de.intevation.flys.backend.SessionHolder;
-
-
-@Entity
-@Table(name = "hws")
-public class Hws
-implements Serializable
-{
- private Integer id;
- private String facility;
- private String type;
- private River river;
- private LineString geom;
-
- public Hws() {
- }
-
-
- @Id
- @Column(name = "id")
- public Integer getId() {
- return id;
- }
-
-
- public void setId(Integer id) {
- this.id = id;
- }
-
-
- @OneToOne
- @JoinColumn(name = "river_id")
- public River getRiver() {
- return river;
- }
-
-
- public void setRiver(River river) {
- this.river = river;
- }
-
-
- @Column(name = "hws_facility")
- public String getFacility() {
- return facility;
- }
-
-
- public void setFacility(String facility) {
- this.facility = facility;
- }
-
-
- @Column(name = "type")
- public String getType() {
- return type;
- }
-
-
- public void setType(String type) {
- this.type = type;
- }
-
-
- @Column(name = "geom")
- @Type(type = "org.hibernatespatial.GeometryUserType")
- public LineString getGeom() {
- return geom;
- }
-
-
- public void setGeom(LineString geom) {
- this.geom = geom;
- }
-
-
- public static List<Hws> getHws(int riverId, String name) {
- Session session = SessionHolder.HOLDER.get();
-
- Query query = session.createQuery(
- "from Hws where river.id =:river_id and name=:name");
- query.setParameter("river_id", riverId);
- query.setParameter("name", name);
-
- return query.list();
- }
-}
-// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/HydrBoundary.java
--- a/flys-backend/src/main/java/de/intevation/flys/model/HydrBoundary.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/model/HydrBoundary.java Fri Mar 22 11:25:54 2013 +0100
@@ -5,16 +5,19 @@
import javax.persistence.Column;
import javax.persistence.Entity;
+import javax.persistence.GeneratedValue;
+import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.JoinColumn;
import javax.persistence.OneToOne;
import javax.persistence.Table;
+import javax.persistence.SequenceGenerator;
import org.hibernate.Session;
import org.hibernate.Query;
import org.hibernate.annotations.Type;
-import com.vividsolutions.jts.geom.LineString;
+import com.vividsolutions.jts.geom.MultiLineString;
import de.intevation.flys.backend.SessionHolder;
@@ -25,15 +28,25 @@
implements Serializable
{
private Integer id;
+ private SectieKind sectie;
+ private SobekKind sobek;
private String name;
private River river;
- private LineString geom;
+ private MultiLineString geom;
+ private BoundaryKind kind;
public HydrBoundary() {
}
@Id
+ @SequenceGenerator(
+ name = "SEQUENCE_HYDR_BOUNDARIES_ID_SEQ",
+ sequenceName = "HYDR_BOUNDARIES_ID_SEQ",
+ allocationSize = 1)
+ @GeneratedValue(
+ strategy = GenerationType.SEQUENCE,
+ generator = "SEQUENCE_HYDR_BOUNDARIES_ID_SEQ")
@Column(name = "id")
public Integer getId() {
return id;
@@ -44,7 +57,6 @@
this.id = id;
}
-
@OneToOne
@JoinColumn(name = "river_id")
public River getRiver() {
@@ -70,12 +82,12 @@
@Column(name = "geom")
@Type(type = "org.hibernatespatial.GeometryUserType")
- public LineString getGeom() {
+ public MultiLineString getGeom() {
return geom;
}
- public void setGeom(LineString geom) {
+ public void setGeom(MultiLineString geom) {
this.geom = geom;
}
@@ -90,5 +102,71 @@
return query.list();
}
+
+ /**
+ * Get sectie.
+ *
+ * @return sectie as SectieKind.
+ */
+ @OneToOne
+ @JoinColumn(name = "sectie")
+ public SectieKind getSectie()
+ {
+ return sectie;
+ }
+
+ /**
+ * Set sectie.
+ *
+ * @param sectie the value to set.
+ */
+ public void setSectie(SectieKind sectie)
+ {
+ this.sectie = sectie;
+ }
+
+ /**
+ * Get sobek.
+ *
+ * @return sobek as SobekKind.
+ */
+ @OneToOne
+ @JoinColumn(name = "sobek")
+ public SobekKind getSobek()
+ {
+ return sobek;
+ }
+
+ /**
+ * Set sobek.
+ *
+ * @param sobek the value to set.
+ */
+ public void setSobek(SobekKind sobek)
+ {
+ this.sobek = sobek;
+ }
+
+ /**
+ * Get kind.
+ *
+ * @return kind as BoundaryKind.
+ */
+ @OneToOne
+ @JoinColumn(name = "kind")
+ public BoundaryKind getKind()
+ {
+ return kind;
+ }
+
+ /**
+ * Set kind.
+ *
+ * @param kind the value to set.
+ */
+ public void setKind(BoundaryKind kind)
+ {
+ this.kind = kind;
+ }
}
// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/HydrBoundaryPoly.java
--- a/flys-backend/src/main/java/de/intevation/flys/model/HydrBoundaryPoly.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/model/HydrBoundaryPoly.java Fri Mar 22 11:25:54 2013 +0100
@@ -5,10 +5,13 @@
import javax.persistence.Column;
import javax.persistence.Entity;
+import javax.persistence.GeneratedValue;
+import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.JoinColumn;
import javax.persistence.OneToOne;
import javax.persistence.Table;
+import javax.persistence.SequenceGenerator;
import org.hibernate.Session;
import org.hibernate.Query;
@@ -28,12 +31,22 @@
private String name;
private River river;
private Geometry geom;
+ private SectieKind sectie;
+ private SobekKind sobek;
+ private BoundaryKind kind;
public HydrBoundaryPoly() {
}
@Id
+ @SequenceGenerator(
+ name = "SEQUENCE_HYDR_BOUNDARIES_POLY_ID_SEQ",
+ sequenceName = "HYDR_BOUNDARIES_POLY_ID_SEQ",
+ allocationSize = 1)
+ @GeneratedValue(
+ strategy = GenerationType.SEQUENCE,
+ generator = "SEQUENCE_HYDR_BOUNDARIES_POLY_ID_SEQ")
@Column(name = "id")
public Integer getId() {
return id;
@@ -90,5 +103,71 @@
return query.list();
}
+
+ /**
+ * Get sectie.
+ *
+ * @return sectie as SectieKind.
+ */
+ @OneToOne
+ @JoinColumn(name = "sectie")
+ public SectieKind getSectie()
+ {
+ return sectie;
+ }
+
+ /**
+ * Set sectie.
+ *
+ * @param sectie the value to set.
+ */
+ public void setSectie(SectieKind sectie)
+ {
+ this.sectie = sectie;
+ }
+
+ /**
+ * Get sobek.
+ *
+ * @return sobek as SobekKind.
+ */
+ @OneToOne
+ @JoinColumn(name = "sobek")
+ public SobekKind getSobek()
+ {
+ return sobek;
+ }
+
+ /**
+ * Set sobek.
+ *
+ * @param sobek the value to set.
+ */
+ public void setSobek(SobekKind sobek)
+ {
+ this.sobek = sobek;
+ }
+
+ /**
+ * Get kind.
+ *
+ * @return kind as BoundaryKind.
+ */
+ @OneToOne
+ @JoinColumn(name = "kind")
+ public BoundaryKind getKind()
+ {
+ return kind;
+ }
+
+ /**
+ * Set kind.
+ *
+ * @param kind the value to set.
+ */
+ public void setKind(BoundaryKind kind)
+ {
+ this.kind = kind;
+ }
}
// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/Line.java
--- a/flys-backend/src/main/java/de/intevation/flys/model/Line.java Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,108 +0,0 @@
-package de.intevation.flys.model;
-
-import java.io.Serializable;
-import java.math.BigDecimal;
-import java.util.List;
-
-import javax.persistence.Column;
-import javax.persistence.Entity;
-import javax.persistence.Id;
-import javax.persistence.JoinColumn;
-import javax.persistence.OneToOne;
-import javax.persistence.Table;
-
-import org.hibernate.Session;
-import org.hibernate.Query;
-
-import org.hibernate.annotations.Type;
-
-import com.vividsolutions.jts.geom.LineString;
-
-import de.intevation.flys.backend.SessionHolder;
-
-
-@Entity
-@Table(name = "lines")
-public class Line
-implements Serializable
-{
- private Integer id;
- private String kind;
- private River river;
- private LineString geom;
- private BigDecimal z;
-
- public Line() {
- }
-
-
- @Id
- @Column(name = "id")
- public Integer getId() {
- return id;
- }
-
-
- public void setId(Integer id) {
- this.id = id;
- }
-
-
- @OneToOne
- @JoinColumn(name = "river_id")
- public River getRiver() {
- return river;
- }
-
-
- public void setRiver(River river) {
- this.river = river;
- }
-
-
- @Column(name = "kind")
- public String getKind() {
- return kind;
- }
-
-
- public void setKind(String kind) {
- this.kind = kind;
- }
-
-
- @Column(name = "geom")
- @Type(type = "org.hibernatespatial.GeometryUserType")
- public LineString getGeom() {
- return geom;
- }
-
-
- public void setGeom(LineString geom) {
- this.geom = geom;
- }
-
-
- @Column(name = "z")
- public BigDecimal getZ() {
- return z;
- }
-
-
- public void setZ(BigDecimal z) {
- this.z = z;
- }
-
-
- public static List<Line> getLines(int riverId, String name) {
- Session session = SessionHolder.HOLDER.get();
-
- Query query = session.createQuery(
- "from Line where river.id =:river_id and name=:name");
- query.setParameter("river_id", riverId);
- query.setParameter("name", name);
-
- return query.list();
- }
-}
-// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/MeasurementStation.java
--- a/flys-backend/src/main/java/de/intevation/flys/model/MeasurementStation.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/model/MeasurementStation.java Fri Mar 22 11:25:54 2013 +0100
@@ -21,7 +21,7 @@
private String measurementType;
private String riverside;
private String operator;
- private String comment;
+ private String description;
private Double station;
private Range range;
@@ -37,7 +37,7 @@
public MeasurementStation(River river, String name, String measurementType,
String riverside, Double station, Range range, Gauge gauge,
- TimeInterval observationTimerange, String operator, String comment) {
+ TimeInterval observationTimerange, String operator, String description) {
this.river = river;
this.name = name;
this.measurementType = measurementType;
@@ -47,7 +47,7 @@
this.gauge = gauge;
this.observationTimerange = observationTimerange;
this.operator = operator;
- this.comment = comment;
+ this.description = description;
}
@Id
@@ -147,13 +147,13 @@
this.operator = operator;
}
- @Column(name = "comment")
- public String getComment() {
- return comment;
+ @Column(name = "description")
+ public String getDescription() {
+ return description;
}
- public void setComment(String comment) {
- this.comment = comment;
+ public void setDescription(String description) {
+ this.description = description;
}
}
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/RiverAxis.java
--- a/flys-backend/src/main/java/de/intevation/flys/model/RiverAxis.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/model/RiverAxis.java Fri Mar 22 11:25:54 2013 +0100
@@ -10,21 +10,21 @@
import javax.persistence.OneToOne;
import javax.persistence.Table;
+import org.hibernate.HibernateException;
import org.hibernate.Session;
import org.hibernate.Query;
import org.hibernate.annotations.Type;
-import com.vividsolutions.jts.geom.LineString;
+import com.vividsolutions.jts.geom.MultiLineString;
import de.intevation.flys.backend.SessionHolder;
+import de.intevation.flys.model.AxisKind;
/**
- * There is a modeling problem with the RiverAxis. The initial idea was, that a
- * river can have a riveraxis that consist of exact one geometry. Now, it has
- * turned out, that a single geometry is not enough for a riveraxis (arm of a
- * river, inflows, ...). As workaround, we now expect, that a river can just
- * have a single riveraxis.
+ * A river has one axis that is used for calculation.
+ * Additional axes of a river can be painted in maps etc.
+ * Which one is the main river axis can be determined via the axis kind.
*/
@Entity
@Table(name = "river_axes")
@@ -32,14 +32,13 @@
implements Serializable
{
private Integer id;
- private Integer kind;
+ private AxisKind kind;
private River river;
- private LineString geom;
+ private MultiLineString geom;
- public static final int DEFAULT_KIND = 0;
-
- public static final int KIND_OFFICIAL = 1;
- public static final int KIND_OUTSOURCED = 2;
+ public static final int KIND_UNKOWN = 0;
+ public static final int KIND_CURRENT = 1;
+ public static final int KIND_OTHER = 2;
public RiverAxis() {
}
@@ -69,43 +68,53 @@
}
- @Column(name = "kind")
- public Integer getKind() {
+ /**
+ * Get kind.
+ *
+ * @return kind as AxisKind.
+ */
+ @OneToOne
+ @JoinColumn(name = "kind_id")
+ public AxisKind getKind() {
return kind;
}
-
- public void setKind(Integer kind) {
+ /**
+ * Set kind.
+ *
+ * @param kind the value to set.
+ */
+ public void setKind(AxisKind kind) {
this.kind = kind;
}
@Column(name = "geom")
@Type(type = "org.hibernatespatial.GeometryUserType")
- public LineString getGeom() {
+ public MultiLineString getGeom() {
return geom;
}
- public void setGeom(LineString geom) {
+ public void setGeom(MultiLineString geom) {
this.geom = geom;
}
- public static List<RiverAxis> getRiverAxis(String river) {
- return getRiverAxis(river, DEFAULT_KIND);
+ public static List<RiverAxis> getRiverAxis(String river)
+ throws IllegalArgumentException {
+ return getRiverAxis(river, KIND_CURRENT);
}
- public static List<RiverAxis> getRiverAxis(String river, int kind) {
+ public static List<RiverAxis> getRiverAxis(String river, int kind)
+ throws HibernateException {
Session session = SessionHolder.HOLDER.get();
-
Query query = session.createQuery(
- "from RiverAxis where river.name =:river AND kind =:kind");
+ "from RiverAxis where river.name =:river AND kind.id =:kind");
query.setParameter("river", river);
query.setParameter("kind", kind);
List<RiverAxis> list = query.list();
-
return list.isEmpty() ? null : list;
}
}
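The reworked getRiverAxis() above now filters on kind.id rather than a plain integer column and keeps the "empty result becomes null" contract. A minimal, Hibernate-free sketch of that selection logic in plain Java (AxisRow is a hypothetical stand-in for the mapped RiverAxis/AxisKind pair; the kind constants mirror the patch):

```java
import java.util.ArrayList;
import java.util.List;

public class AxisKindDemo {
    // Constants as introduced by the patch (spelling of KIND_UNKOWN kept as-is).
    static final int KIND_UNKOWN  = 0;
    static final int KIND_CURRENT = 1;
    static final int KIND_OTHER   = 2;

    // Hypothetical flat stand-in for a RiverAxis row joined with its AxisKind.
    static class AxisRow {
        final int id;
        final int kindId;
        AxisRow(int id, int kindId) { this.id = id; this.kindId = kindId; }
    }

    // Java-side equivalent of the HQL
    // "from RiverAxis where river.name = :river AND kind.id = :kind":
    // keep only axes of the requested kind, return null when none match,
    // matching getRiverAxis()'s "list.isEmpty() ? null : list" behaviour.
    static List<AxisRow> byKind(List<AxisRow> axes, int kind) {
        List<AxisRow> result = new ArrayList<AxisRow>();
        for (AxisRow a : axes) {
            if (a.kindId == kind) {
                result.add(a);
            }
        }
        return result.isEmpty() ? null : result;
    }

    public static void main(String[] args) {
        List<AxisRow> axes = new ArrayList<AxisRow>();
        axes.add(new AxisRow(1, KIND_CURRENT));
        axes.add(new AxisRow(2, KIND_OTHER));
        System.out.println(byKind(axes, KIND_CURRENT).size());
    }
}
```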
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/SectieKind.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-backend/src/main/java/de/intevation/flys/model/SectieKind.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,44 @@
+package de.intevation.flys.model;
+
+import java.io.Serializable;
+
+import javax.persistence.Column;
+import javax.persistence.Entity;
+import javax.persistence.Id;
+import javax.persistence.Table;
+
+@Entity
+@Table(name = "sectie_kinds")
+public class SectieKind implements Serializable {
+ private Integer id;
+ private String name;
+
+ @Id
+ @Column(name = "id")
+ public Integer getId() {
+ return id;
+ }
+
+ public void setId(Integer id) {
+ this.id = id;
+ }
+
+ /**
+ * Get name.
+ *
+ * @return name of the kind of sectie as String.
+ */
+ @Column(name = "name")
+ public String getName() {
+ return name;
+ }
+
+ /**
+ * Set name.
+ *
+ * @param name the value to set.
+ */
+ public void setName(String name) {
+ this.name = name;
+ }
+}
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/SobekKind.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-backend/src/main/java/de/intevation/flys/model/SobekKind.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,45 @@
+package de.intevation.flys.model;
+
+import java.io.Serializable;
+
+import javax.persistence.Column;
+import javax.persistence.Entity;
+import javax.persistence.Id;
+import javax.persistence.Table;
+
+@Entity
+@Table(name = "sobek_kinds")
+public class SobekKind implements Serializable {
+
+ private Integer id;
+ private String name;
+
+ @Id
+ @Column(name = "id")
+ public Integer getId() {
+ return id;
+ }
+
+ public void setId(Integer id) {
+ this.id = id;
+ }
+
+ /**
+ * Get name.
+ *
+ * @return name of the kind of sobek as String.
+ */
+ @Column(name = "name")
+ public String getName() {
+ return name;
+ }
+
+ /**
+ * Set name.
+ *
+ * @param name the value to set.
+ */
+ public void setName(String name) {
+ this.name = name;
+ }
+}
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/Waterlevel.java
--- a/flys-backend/src/main/java/de/intevation/flys/model/Waterlevel.java Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,113 +0,0 @@
-package de.intevation.flys.model;
-
-import java.io.Serializable;
-import java.util.List;
-
-import javax.persistence.Entity;
-import javax.persistence.Id;
-import javax.persistence.Table;
-import javax.persistence.GeneratedValue;
-import javax.persistence.Column;
-import javax.persistence.SequenceGenerator;
-import javax.persistence.GenerationType;
-import javax.persistence.JoinColumn;
-import javax.persistence.OneToOne;
-import javax.persistence.OneToMany;
-
-import org.apache.log4j.Logger;
-
-
-
-/** Mapped Waterlevel. */
-@Entity
-@Table(name = "waterlevel")
-public class Waterlevel
-implements Serializable
-{
- private static Logger logger = Logger.getLogger(Waterlevel.class);
-
- private Integer id;
-
- private River river;
-
- private Unit unit;
-
- private String description;
-
- private List<WaterlevelQRange> qRanges;
-
-
- public Waterlevel() {
- }
-
- public Waterlevel(River river, Unit unit) {
- this.river = river;
- this.unit = unit;
- }
-
- public Waterlevel(River river, Unit unit, String description) {
- this(river, unit);
- this.description = description;
- }
-
- @Id
- @SequenceGenerator(
- name = "SEQUENCE_WATERLEVEL_ID_SEQ",
- sequenceName = "WATERLEVEL_ID_SEQ",
- allocationSize = 1)
- @GeneratedValue(
- strategy = GenerationType.SEQUENCE,
- generator = "SEQUENCE_WATERLEVEL_ID_SEQ")
- @Column(name = "id")
- public Integer getId() {
- return id;
- }
-
- public void setId(Integer id) {
- this.id = id;
- }
-
- @OneToOne
- @JoinColumn(name = "river_id" )
- public River getRiver() {
- return river;
- }
-
- public void setRiver(River river) {
- this.river = river;
- }
-
- @OneToOne
- @JoinColumn(name = "unit_id")
- public Unit getUnit() {
- return unit;
- }
-
- public void setUnit(Unit unit) {
- this.unit = unit;
- }
-
- @Column(name = "description")
- public String getDescription() {
- return description;
- }
-
- public void setDescription(String description) {
- this.description = description;
- }
-
- @OneToMany
- @JoinColumn(name="waterlevel_id")
- public List<WaterlevelQRange> getQRanges() {
- return qRanges;
- }
-
- public void setQRanges(List<WaterlevelQRange> qRanges) {
- this.qRanges = qRanges;
- }
-
- public void addQRange(WaterlevelQRange qRange) {
- qRanges.add(qRange);
- }
-}
-// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/WaterlevelDifference.java
--- a/flys-backend/src/main/java/de/intevation/flys/model/WaterlevelDifference.java Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,119 +0,0 @@
-package de.intevation.flys.model;
-
-import java.io.Serializable;
-import java.util.ArrayList;
-import java.util.List;
-
-import javax.persistence.Entity;
-import javax.persistence.Id;
-import javax.persistence.Table;
-import javax.persistence.GeneratedValue;
-import javax.persistence.Column;
-import javax.persistence.SequenceGenerator;
-import javax.persistence.GenerationType;
-import javax.persistence.JoinColumn;
-import javax.persistence.OneToOne;
-import javax.persistence.OneToMany;
-
-import org.apache.log4j.Logger;
-
-
-@Entity
-@Table(name = "waterlevel_difference")
-public class WaterlevelDifference
-implements Serializable
-{
- private static Logger logger = Logger.getLogger(WaterlevelDifference.class);
-
- private Integer id;
-
- private River river;
-
- private Unit unit;
-
- private List<WaterlevelDifferenceColumn> columns;
-
- private String description;
-
-
- public WaterlevelDifference() {
- columns = new ArrayList<WaterlevelDifferenceColumn>();
- }
-
-
- public WaterlevelDifference(River river, Unit unit) {
- this();
-
- this.river = river;
- this.unit = unit;
- }
-
-
- public WaterlevelDifference(River river, Unit unit, String description) {
- this(river, unit);
-
- this.description = description;
- }
-
-
- @Id
- @SequenceGenerator(
- name = "SEQUENCE_WATERLEVEL_DIFFERENCE_ID_SEQ",
- sequenceName = "WATERLEVEL_DIFFERENCE_ID_SEQ",
- allocationSize = 1)
- @GeneratedValue(
- strategy = GenerationType.SEQUENCE,
- generator = "SEQUENCE_WATERLEVEL_DIFFERENCE_ID_SEQ")
- @Column(name = "id")
- public Integer getId() {
- return id;
- }
-
- public void setId(Integer id) {
- this.id = id;
- }
-
- @OneToOne
- @JoinColumn(name = "river_id" )
- public River getRiver() {
- return river;
- }
-
- public void setRiver(River river) {
- this.river = river;
- }
-
- @OneToOne
- @JoinColumn(name = "unit_id")
- public Unit getUnit() {
- return unit;
- }
-
- public void setUnit(Unit unit) {
- this.unit = unit;
- }
-
- @Column(name = "description")
- public String getDescription() {
- return description;
- }
-
- public void setDescription(String description) {
- this.description = description;
- }
-
- @OneToMany
- @JoinColumn(name = "difference_id")
- public List<WaterlevelDifferenceColumn> getColumns() {
- return columns;
- }
-
- public void setColumns(List<WaterlevelDifferenceColumn> columns) {
- this.columns = columns;
- }
-
- public void addColumn(WaterlevelDifferenceColumn column) {
- this.columns.add(column);
- }
-}
-// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/WaterlevelDifferenceColumn.java
--- a/flys-backend/src/main/java/de/intevation/flys/model/WaterlevelDifferenceColumn.java Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,104 +0,0 @@
-package de.intevation.flys.model;
-
-import java.io.Serializable;
-import java.util.ArrayList;
-import java.util.List;
-
-import javax.persistence.Entity;
-import javax.persistence.Id;
-import javax.persistence.Table;
-import javax.persistence.GeneratedValue;
-import javax.persistence.Column;
-import javax.persistence.SequenceGenerator;
-import javax.persistence.GenerationType;
-import javax.persistence.JoinColumn;
-import javax.persistence.OneToOne;
-import javax.persistence.OneToMany;
-
-import org.apache.log4j.Logger;
-
-
-@Entity
-@Table(name = "waterlevel_difference_column")
-public class WaterlevelDifferenceColumn
-implements Serializable
-{
- private static Logger logger =
- Logger.getLogger(WaterlevelDifferenceColumn.class);
-
-
- private Integer id;
-
- private WaterlevelDifference difference;
-
- private List<WaterlevelDifferenceValue> values;
-
- private String description;
-
-
- public WaterlevelDifferenceColumn() {
- values = new ArrayList<WaterlevelDifferenceValue>();
- }
-
- public WaterlevelDifferenceColumn(
- WaterlevelDifference difference,
- String description
- ) {
- this();
-
- this.difference = difference;
- this.description = description;
- }
-
-
- @Id
- @SequenceGenerator(
- name = "SEQUENCE_WATERLEVEL_DIFF_COLUMN_ID_SEQ",
- sequenceName = "WATERLEVEL_DIFF_COLUMN_ID_SEQ",
- allocationSize = 1)
- @GeneratedValue(
- strategy = GenerationType.SEQUENCE,
- generator = "SEQUENCE_WATERLEVEL_DIFF_COLUMN_ID_SEQ")
- @Column(name = "id")
- public Integer getId() {
- return id;
- }
-
- public void setId(Integer id) {
- this.id = id;
- }
-
- @OneToOne
- @JoinColumn(name = "difference_id" )
- public WaterlevelDifference getDifference() {
- return difference;
- }
-
- public void setDifference(WaterlevelDifference difference) {
- this.difference = difference;
- }
-
- @Column(name = "description")
- public String getDescription() {
- return description;
- }
-
- public void setDescription(String description) {
- this.description = description;
- }
-
- @OneToMany
- @JoinColumn(name = "column_id")
- public List<WaterlevelDifferenceValue> getValues() {
- return values;
- }
-
- public void setValues(List<WaterlevelDifferenceValue> values) {
- this.values = values;
- }
-
- public void addValue(WaterlevelDifferenceValue value) {
- this.values.add(value);
- }
-}
-// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/WaterlevelDifferenceValue.java
--- a/flys-backend/src/main/java/de/intevation/flys/model/WaterlevelDifferenceValue.java Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,94 +0,0 @@
-package de.intevation.flys.model;
-
-import java.io.Serializable;
-
-import javax.persistence.Entity;
-import javax.persistence.Id;
-import javax.persistence.Table;
-import javax.persistence.GeneratedValue;
-import javax.persistence.Column;
-import javax.persistence.SequenceGenerator;
-import javax.persistence.GenerationType;
-import javax.persistence.JoinColumn;
-import javax.persistence.OneToOne;
-
-import org.apache.log4j.Logger;
-
-
-@Entity
-@Table(name = "waterlevel_difference_values")
-public class WaterlevelDifferenceValue
-implements Serializable
-{
- private static Logger logger =
- Logger.getLogger(WaterlevelDifferenceValue.class);
-
-
- private Integer id;
-
- private WaterlevelDifferenceColumn column;
-
- private Double station;
- private Double value;
-
-
- public WaterlevelDifferenceValue() {
- }
-
- public WaterlevelDifferenceValue(
- WaterlevelDifferenceColumn column,
- Double station,
- Double value
- ) {
- this.column = column;
- this.station = station;
- this.value = value;
- }
-
-
- @Id
- @SequenceGenerator(
- name = "SEQUENCE_WATERLEVEL_DIFF_VALUES_ID_SEQ",
- sequenceName = "WATERLEVEL_DIFF_VALUES_ID_SEQ",
- allocationSize = 1)
- @GeneratedValue(
- strategy = GenerationType.SEQUENCE,
- generator = "SEQUENCE_WATERLEVEL_DIFF_VALUES_ID_SEQ")
- @Column(name = "id")
- public Integer getId() {
- return id;
- }
-
- public void setId(Integer id) {
- this.id = id;
- }
-
- @OneToOne
- @JoinColumn(name = "column_id" )
- public WaterlevelDifferenceColumn getColumn() {
- return column;
- }
-
- public void setColumn(WaterlevelDifferenceColumn column) {
- this.column = column;
- }
-
- @Column(name = "station")
- public Double getStation() {
- return station;
- }
-
- public void setStation(Double station) {
- this.station = station;
- }
-
- @Column(name = "value")
- public Double getValue() {
- return value;
- }
-
- public void setValue(Double value) {
- this.value = value;
- }
-}
-// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/WaterlevelQRange.java
--- a/flys-backend/src/main/java/de/intevation/flys/model/WaterlevelQRange.java Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,100 +0,0 @@
-package de.intevation.flys.model;
-
-import java.io.Serializable;
-import java.util.ArrayList;
-import java.util.List;
-
-import javax.persistence.Entity;
-import javax.persistence.Id;
-import javax.persistence.Table;
-import javax.persistence.GeneratedValue;
-import javax.persistence.Column;
-import javax.persistence.SequenceGenerator;
-import javax.persistence.GenerationType;
-import javax.persistence.JoinColumn;
-import javax.persistence.OneToOne;
-import javax.persistence.OneToMany;
-
-import org.apache.log4j.Logger;
-
-
-
-
-@Entity
-@Table(name = "waterlevel_q_range")
-public class WaterlevelQRange
-implements Serializable
-{
- private static Logger logger = Logger.getLogger(WaterlevelQRange.class);
-
- private Integer id;
-
- private Waterlevel waterlevel;
-
- private Double q;
-
- private List<WaterlevelValue> values;
-
-
- public WaterlevelQRange() {
- this.values = new ArrayList<WaterlevelValue>();
- }
-
- public WaterlevelQRange(Waterlevel waterlevel, Double q) {
- this();
- this.q = q;
- this.waterlevel = waterlevel;
- }
-
-
- @Id
- @SequenceGenerator(
- name = "SEQUENCE_WATERLEVEL_Q_RANGE_ID_SEQ",
- sequenceName = "WATERLEVEL_Q_RANGES_ID_SEQ",
- allocationSize = 1)
- @GeneratedValue(
- strategy = GenerationType.SEQUENCE,
- generator = "SEQUENCE_WATERLEVEL_Q_RANGE_ID_SEQ")
- @Column(name = "id")
- public Integer getId() {
- return id;
- }
-
- public void setId(Integer id) {
- this.id = id;
- }
-
- @OneToOne
- @JoinColumn(name = "waterlevel_id" )
- public Waterlevel getWaterlevel() {
- return waterlevel;
- }
-
- public void setWaterlevel(Waterlevel waterlevel) {
- this.waterlevel = waterlevel;
- }
-
- @Column(name = "q")
- public Double getQ() {
- return q;
- }
-
- public void setQ(Double q) {
- this.q = q;
- }
-
- @OneToMany
- @Column(name = "waterlevel_q_range_id")
- public List<WaterlevelValue> getValues() {
- return values;
- }
-
- public void setValues(List<WaterlevelValue> values) {
- this.values = values;
- }
-
- public void addValue(WaterlevelValue value) {
- values.add(value);
- }
-}
-// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/model/WaterlevelValue.java
--- a/flys-backend/src/main/java/de/intevation/flys/model/WaterlevelValue.java Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,90 +0,0 @@
-package de.intevation.flys.model;
-
-import java.io.Serializable;
-
-import javax.persistence.Entity;
-import javax.persistence.Id;
-import javax.persistence.Table;
-import javax.persistence.GeneratedValue;
-import javax.persistence.Column;
-import javax.persistence.SequenceGenerator;
-import javax.persistence.GenerationType;
-import javax.persistence.JoinColumn;
-import javax.persistence.OneToOne;
-
-import org.apache.log4j.Logger;
-
-
-
-
-@Entity
-@Table(name = "waterlevel_values")
-public class WaterlevelValue
-implements Serializable
-{
- private static Logger logger = Logger.getLogger(WaterlevelValue.class);
-
- private Integer id;
-
- private WaterlevelQRange qrange;
-
- private Double station;
- private Double w;
-
-
- public WaterlevelValue() {
- }
-
- public WaterlevelValue(WaterlevelQRange qrange, Double station, Double w) {
- this.qrange = qrange;
- this.station = station;
- this.w = w;
- }
-
-
- @Id
- @SequenceGenerator(
- name = "SEQUENCE_WATERLEVEL_VALUES_ID_SEQ",
- sequenceName = "WATERLEVEL_VALUES_ID_SEQ",
- allocationSize = 1)
- @GeneratedValue(
- strategy = GenerationType.SEQUENCE,
- generator = "SEQUENCE_WATERLEVEL_VALUES_ID_SEQ")
- @Column(name = "id")
- public Integer getId() {
- return id;
- }
-
- public void setId(Integer id) {
- this.id = id;
- }
-
- @OneToOne
- @JoinColumn(name = "waterlevel_q_range_id" )
- public WaterlevelQRange getQrange() {
- return qrange;
- }
-
- public void setQrange(WaterlevelQRange qrange) {
- this.qrange = qrange;
- }
-
- @Column(name = "station")
- public Double getStation() {
- return station;
- }
-
- public void setStation(Double station) {
- this.station = station;
- }
-
- @Column(name = "w")
- public Double getW() {
- return w;
- }
-
- public void setW(Double w) {
- this.w = w;
- }
-}
-// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/utils/DBCPConnectionProvider.java
--- a/flys-backend/src/main/java/de/intevation/flys/utils/DBCPConnectionProvider.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-backend/src/main/java/de/intevation/flys/utils/DBCPConnectionProvider.java Fri Mar 22 11:25:54 2013 +0100
@@ -21,6 +21,8 @@
import java.util.Iterator;
import java.util.Properties;
import java.util.Map;
+import java.util.Collections;
+import java.util.StringTokenizer;
import org.apache.commons.dbcp.BasicDataSource;
import org.apache.commons.dbcp.BasicDataSourceFactory;
@@ -194,6 +196,13 @@
ds = (BasicDataSource)BasicDataSourceFactory
.createDataSource(dbcpProperties);
+ // This must be set manually because BasicDataSourceFactory
+ // ignores connectionInitSqls if it is passed as a dbcpProperty
+ String connectionInitSqls = props.getProperty("connectionInitSqls");
+ if (connectionInitSqls != null) {
+ StringTokenizer tokenizer = new StringTokenizer(connectionInitSqls, ";");
+ ds.setConnectionInitSqls(Collections.list(tokenizer));
+ }
// The BasicDataSource has lazy initialization
// borrowing a connection will start the DataSource
// and make sure it is configured correctly.
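The hunk above works around BasicDataSourceFactory dropping the connectionInitSqls property by splitting the semicolon-separated string by hand with a StringTokenizer. A minimal standalone sketch of that split, outside Hibernate/DBCP (the class name and sample SQL strings here are illustrative, not from the patch):

```java
import java.util.Collections;
import java.util.List;
import java.util.StringTokenizer;

public class InitSqlSplit {

    // Split a semicolon-separated connectionInitSqls string into the
    // list form that BasicDataSource.setConnectionInitSqls() expects.
    public static List<Object> split(String connectionInitSqls) {
        StringTokenizer tokenizer = new StringTokenizer(connectionInitSqls, ";");
        // Collections.list drains the Enumeration into an ArrayList.
        return Collections.list(tokenizer);
    }

    public static void main(String[] args) {
        List<Object> sqls = split(
            "SET search_path TO flys;SET statement_timeout = 0");
        System.out.println(sqls.size());
        System.out.println(sqls.get(0));
    }
}
```

Note that StringTokenizer keeps no empty tokens, so a trailing semicolon does not produce an empty SQL statement.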
diff -r cfc5540a4eec -r 61bf64b102bc flys-backend/src/main/java/de/intevation/flys/utils/DgmSqlConverter.java
--- a/flys-backend/src/main/java/de/intevation/flys/utils/DgmSqlConverter.java Wed Mar 06 14:14:15 2013 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,483 +0,0 @@
-package de.intevation.flys.utils;
-
-import java.io.BufferedInputStream;
-import java.io.BufferedWriter;
-import java.io.File;
-import java.io.FileInputStream;
-import java.io.FileNotFoundException;
-import java.io.FileWriter;
-import java.io.IOException;
-import java.io.InputStream;
-import java.io.InputStreamReader;
-import java.io.Reader;
-import java.util.ArrayList;
-import java.util.List;
-
-import au.com.bytecode.opencsv.CSVReader;
-
-
-/**
- * A converter for CSV files with DGM information. The result of a conversion
- * is an SQL file with "INSERT INTO dem ..." statements.
- * <br>
- * To start the converter, at least the following three system properties are
- * required:
- * <br>
- * <ul>
- * <li><b>gew.dir</b>: This property must point to the directory where all
- * rivers are stored.</li>
- * <li><b>csv</b>: This property must point to the CSV file that contains the
- * DGM information.</li>
- * <li><b>sql</b>: This property must point to a (not yet existing) file that
- * will be generated by this converter.</li>
- * </ul>
- * <br>
- * In addition, the following properties are accepted to modify log messages,
- * etc.
- * <ul>
- * <li><b>verbose</b>: Accepts integer values (0, 1, 2, 3) to modify the log
- * messages. The higher the value, the more log messages are printed to STDOUT.
- * </li>
- * <li><b>full</b>: Accepts true|false values. If true is set, all rivers
- * included in the CSV file are taken into account while parsing. Otherwise,
- * the converter reads information for 'Saar', 'Mosel' and 'Elbe' only.</li>
- * </ul>
- *
- * @author Ingo Weinzierl <a href="mailto:ingo.weinzierl@intevation.de">
- * ingo.weinzierl@intevation.de</a>
- *
- */
-public class DgmSqlConverter {
-
- public static final String SQL_INSERT = "INSERT INTO dem (river_id, name, lower, upper, year_from, year_to,"
- + "projection, elevation_state, format, border_break, resolution, description, path) VALUES ("
- + "%s, '%s', %s, %s, %s, %s, '%s', '%s', '%s', %s, '%s', '%s', '%s');";
-
- public static final String SQL_SELECT_RIVER = "(SELECT id from rivers WHERE name = '%s')";
-
- public static final char DEFAULT_SEPERATOR = ',';
- public static final char DEFAULT_QUOTE = '"';
- public static final int DEFAULT_LOG_LEVEL = 2;
-
- public static final boolean FULL_MODE = Boolean.getBoolean("full");
- public static final String GEW_DIR = System.getProperty("gew.dir", null);
- public static final String CSV_FILE = System.getProperty("csv");
- public static final String SQL_FILE = System.getProperty("sql");
- public static final int LOG_LEVEL = Integer.getInteger("verbose",
- DEFAULT_LOG_LEVEL);
-
- public static final int MIN_COLUMN_COUNT = 15;
-
- public static final int IDX_RIVERNAME = 0;
- public static final int IDX_NAME = 12;
- public static final int IDX_LOWER = 1;
- public static final int IDX_UPPER = 2;
- public static final int IDX_YEAR_FROM = 3;
- public static final int IDX_YEAR_TO = 4;
- public static final int IDX_PROJECTION = 7;
- public static final int IDX_ELEVATION_STATE = 8;
- public static final int IDX_FORMAT = 9;
- public static final int IDX_BORDER_BREAK = 10;
- public static final int IDX_RESOLUTION = 11;
- public static final int IDX_DESCRIPTION = 14;
- public static final int IDX_FILE_NAME = 5;
- public static final int IDX_FILE_PATH = 6;
-
- private class DGM {
-
- public String river;
- public String name;
- public String projection;
- public String elevationState;
- public String format;
- public String resolution;
- public String description;
- public String path;
-
- public double lower;
- public double upper;
- public Integer yearFrom;
- public Integer yearTo;
-
- public boolean borderBreak;
-
- public DGM() {
- borderBreak = false;
- }
-
- public String toSQL() {
- String riverId = String.format(SQL_SELECT_RIVER, river);
- String lower = String.valueOf(this.lower);
- String upper = String.valueOf(this.upper);
- String yearFrom = this.yearFrom != null ? String
- .valueOf(this.yearFrom) : "";
- String yearTo = this.yearTo != null ? String.valueOf(this.yearTo)
- : "";
-
- return String.format(SQL_INSERT, riverId, name, lower, upper,
- yearFrom, yearTo, projection, elevationState, format,
- borderBreak, resolution, description, path);
- }
- }
-
- private File riverDir;
- private File csv;
- private File sql;
-
- private List<DGM> dgms;
-
- public static void debug(String msg) {
- if (LOG_LEVEL >= 3) {
- System.out.println("DEBUG: " + msg);
- }
- }
-
- public static void info(String msg) {
- if (LOG_LEVEL >= 2) {
- System.out.println("INFO: " + msg);
- }
- }
-
- public static void warn(String msg) {
- if (LOG_LEVEL >= 1) {
- System.out.println("WARN: " + msg);
- }
- }
-
- public static void error(String msg) {
- System.out.println("ERROR: " + msg);
- }
-
- public static File getRiverDir(String[] args) {
- if (GEW_DIR != null && GEW_DIR.length() > 0) {
- return new File(GEW_DIR);
- }
- else if (args != null && args.length > 0) {
- return new File(args[0]);
- }
-
- return null;
- }
-
- public static File getCSVFile(String[] args) {
- if (CSV_FILE != null && CSV_FILE.length() > 0) {
- return new File(CSV_FILE);
- }
- else if (args != null && args.length > 1) {
- return new File(args[1]);
- }
-
- return null;
- }
-
- public static File getSQLFile(String[] args) {
- if (SQL_FILE != null && SQL_FILE.length() > 0) {
- return new File(SQL_FILE);
- }
- else if (args != null && args.length > 2) {
- return new File(args[2]);
- }
-
- return null;
- }
-
- public static void main(String[] args) {
- info("Start converting CSV -> SQL statements");
-
- if (!FULL_MODE) {
- info("You are running in DEMO mode; rivers other than 'Saar', 'Mosel' and 'Elbe' are ignored.");
- }
-
- File riverDir = getRiverDir(args);
-
- if (riverDir == null) {
- warn("No rivers directory specified!");
- return;
- }
- else if (!riverDir.isDirectory()) {
- warn("Specified rivers directory is not a directory!");
- return;
- }
- else if (!riverDir.canRead()) {
- warn("Unable to read '" + riverDir.toString() + "'");
- return;
- }
-
- File csv = getCSVFile(args);
-
- if (csv == null) {
- warn("No CSV file specified!");
- return;
- }
- else if (csv.isDirectory()) {
- warn("Specified CSV file is a directory!");
- return;
- }
- else if (!csv.canRead()) {
- warn("Unable to read '" + csv.toString() + "'");
- return;
- }
-
- File sql = getSQLFile(args);
-
- if (sql == null) {
- warn("No destination file specified!");
- return;
- }
- else if (sql.isDirectory()) {
- warn("Specified destination file is a directory!");
- return;
- }
- else if (sql.exists() && !sql.canWrite()) {
- warn("Unable to write to '" + sql.toString() + "'");
- return;
- }
- else if (!sql.exists()) {
- try {
- sql.createNewFile();
- }
- catch (IOException ioe) {
- warn("Unable to write to '" + sql.toString() + "'");
- return;
- }
- }
-
- info("Start parsing CSV file '" + csv.toString() + "'");
-
- try {
- DgmSqlConverter parser = new DgmSqlConverter(riverDir, csv, sql);
- parser.read();
- parser.write();
- }
- catch (Exception e) {
- error("Unexpected error: " + e.getMessage());
- e.printStackTrace();
- }
-
- info("Finished converting CSV -> SQL successfully.");
- }
-
- public DgmSqlConverter(File riverDir, File csv, File sql) {
- this.riverDir = riverDir;
- this.csv = csv;
- this.sql = sql;
- this.dgms = new ArrayList<DGM>();
- }
-
- public void read() {
- info("Read DGM information from CSV file: " + csv.getAbsolutePath());
-
- InputStream in = null;
-
- try {
- in = new BufferedInputStream(new FileInputStream(csv));
- }
- catch (FileNotFoundException e) {
- error("File not found: " + e.getMessage());
- return;
- }
-
- Reader reader = new InputStreamReader(in);
- CSVReader csvReader = new CSVReader(reader, DEFAULT_SEPERATOR,
- DEFAULT_QUOTE);
-
- List<String[]> rows = new ArrayList<String[]>();
-
- int success = 0;
-
- try {
- rows = csvReader.readAll();
-
- for (int idx = 0; idx < rows.size(); idx++) {
- String[] row = rows.get(idx);
- if (readRow(row)) {
- success++;
- }
- else {
- warn("Unable to parse row " + (idx + 1));
- }
- }
- }
- catch (IOException e) {
- error("Error while parsing CSV: " + e.getMessage());
- return;
- }
-
- info("Parsed CSV file: " + rows.size() + " lines.");
- info("Parsed " + success + " lines successfully");
- }
-
- private boolean readRow(String[] row) {
- if (row == null) {
- warn("Row is null!");
- return false;
- }
-
- if (row.length < MIN_COLUMN_COUNT) {
- warn("Invalid column count: " + row.length);
- return false;
- }
-
- StringBuffer rowBuffer = new StringBuffer();
- for (String col : row) {
- rowBuffer.append(col);
- rowBuffer.append(" | ");
- }
- debug(rowBuffer.toString());
-
- try {
- DGM dgm = new DGM();
- dgm.river = readRiver(row[IDX_RIVERNAME]);
- dgm.name = row[IDX_NAME];
- dgm.projection = row[IDX_PROJECTION];
- dgm.elevationState = row[IDX_ELEVATION_STATE];
- dgm.format = row[IDX_FORMAT];
- dgm.resolution = row[IDX_RESOLUTION];
- dgm.description = row[IDX_DESCRIPTION];
- dgm.lower = readLower(row[IDX_LOWER]);
- dgm.upper = readUpper(row[IDX_UPPER]);
- dgm.yearFrom = readFromYear(row[IDX_YEAR_FROM]);
- dgm.yearTo = readToYear(row[IDX_YEAR_TO]);
- dgm.borderBreak = readBorderBreak(row[IDX_BORDER_BREAK]);
- dgm.path = readPath(dgm.river, row[IDX_FILE_PATH],
- row[IDX_FILE_NAME]);
-
- dgms.add(dgm);
-
- return true;
- }
- catch (IllegalArgumentException iae) {
- warn(iae.getMessage());
- }
-
- return false;
- }
-
- private String readRiver(String rivername) throws IllegalArgumentException {
- if (rivername == null || rivername.length() == 0) {
- throw new IllegalArgumentException("Invalid rivername: " + rivername);
- }
-
- if (!FULL_MODE
- && !(rivername.equals("Saar") || rivername.equals("Mosel") || rivername
- .equals("Elbe"))) {
- throw new IllegalArgumentException("In DEMO mode; skip river: "
- + rivername);
- }
-
- return rivername;
- }
-
- private Double readLower(String lower) throws IllegalArgumentException {
- try {
- return Double.valueOf(lower);
- }
- catch (NumberFormatException nfe) {
- }
-
- throw new IllegalArgumentException("Attribute 'lower' invalid: "
- + lower);
- }
-
- private Double readUpper(String upper) throws IllegalArgumentException {
- try {
- return Double.valueOf(upper);
- }
- catch (NumberFormatException nfe) {
- }
-
- throw new IllegalArgumentException("Attribute 'upper' invalid: "
- + upper);
- }
-
- private Integer readFromYear(String from) throws IllegalArgumentException {
- try {
- return Integer.valueOf(from);
- }
- catch (NumberFormatException nfe) {
- }
-
- return null;
- }
-
- private Integer readToYear(String to) throws IllegalArgumentException {
- try {
- return Integer.valueOf(to);
- }
- catch (NumberFormatException nfe) {
- }
-
- return null;
- }
-
- private String readPath(String rivername, String dir, String filename)
- throws IllegalArgumentException {
- File riverDir = new File(this.riverDir, rivername);
- File dgmDir = new File(riverDir, dir);
- File dgmFile = new File(dgmDir, filename);
-
- try {
- debug("Path of DGM = " + dgmFile.getAbsolutePath());
-
- if (dgmFile == null || !dgmFile.exists()) {
- throw new IllegalAccessError(
- "Specified DGM file does not exist: "
- + dgmFile.getAbsolutePath());
- }
-
- if (!dgmFile.isFile()) {
- throw new IllegalArgumentException(
- "Specified DGM file is no file: "
- + dgmFile.getAbsolutePath());
- }
- }
- catch (IllegalAccessError iae) {
- throw new IllegalArgumentException("Cannot find DGM file (river="
- + rivername + " | directory=" + dir + " | filename=" + filename
- + ")");
- }
-
- return dgmFile.getAbsolutePath();
- }
-
- private boolean readBorderBreak(String borderBreak) {
- if (borderBreak == null || borderBreak.length() == 0) {
- return true;
- }
- else if (borderBreak.toLowerCase().equals("ja")) {
- return true;
- }
- else if (borderBreak.toLowerCase().equals("nein")) {
- return false;
- }
- else {
- return true;
- }
- }
-
- public void write() {
- info("Write DEM information to SQL file: " + sql.getAbsolutePath());
-
- BufferedWriter bufferedWriter = null;
- try {
- bufferedWriter = new BufferedWriter(new FileWriter(sql));
-
- for (DGM dgm : dgms) {
- bufferedWriter.write(dgm.toSQL());
- bufferedWriter.newLine();
- }
- }
- catch (IOException ioe) {
- error(ioe.getMessage());
- }
- finally {
- if (bufferedWriter != null) {
- try {
- bufferedWriter.close();
- }
- catch (IOException ioe) {
- }
- }
- }
- }
-}
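The deleted DgmSqlConverter maps the German CSV values "ja"/"nein" in the border-break column onto a boolean, treating only an explicit "nein" as false and defaulting everything else (including null and empty strings) to true. A self-contained sketch of that rule, mirroring the removed readBorderBreak method (the wrapper class name is illustrative):

```java
public class BorderBreak {

    // Mirrors the deleted DgmSqlConverter.readBorderBreak():
    // only an explicit "nein" (case-insensitive) disables the
    // border break; null, empty and unknown values default to true.
    public static boolean readBorderBreak(String value) {
        if (value == null || value.isEmpty()) {
            return true;
        }
        return !value.toLowerCase().equals("nein");
    }

    public static void main(String[] args) {
        System.out.println(readBorderBreak("ja"));
        System.out.println(readBorderBreak("NEIN"));
        System.out.println(readBorderBreak(null));
    }
}
```

Defaulting unknown values to true is a conservative choice for this importer: a row with a malformed flag still gets the safer "border break present" marker rather than being silently treated as break-free.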
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/pom.xml
--- a/flys-client/pom.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/pom.xml Fri Mar 22 11:25:54 2013 +0100
@@ -227,6 +227,11 @@
<id>org.mapfish</id>
<url>http://dev.mapfish.org/maven/repository</url>
</repository>
+ <repository>
+ <id>osgeo</id>
+ <name>Open Source Geospatial Foundation Repository</name>
+ <url>http://download.osgeo.org/webdav/geotools/</url>
+ </repository>
</repositories>
</project>
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/FLYSConstants.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/FLYSConstants.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/FLYSConstants.java Fri Mar 22 11:25:54 2013 +0100
@@ -252,6 +252,8 @@
String wqQ();
+ String wqQatGauge();
+
String wqQGauge();
String wqSingle();
@@ -284,6 +286,10 @@
String footerImpressum();
+ String projectListMin();
+
+ String projectListAdd();
+
String buttonNext();
String imageBack();
@@ -324,6 +330,10 @@
String downloadCSV();
+ String downloadAT();
+
+ String downloadWST();
+
String loadingImg();
String cancelCalculation();
@@ -416,6 +426,8 @@
// OUTPUT TYPES
+ String discharge_curve_gaugeless();
+
String discharge_curve();
String gauge_discharge_curve();
@@ -442,6 +454,8 @@
String wq_table_w();
+ String wq_waterlevel_label();
+
String wq_table_q();
String wq_value_w();
@@ -846,6 +860,8 @@
String background();
+ String discharge_tables_chart();
+
String discharge_table_nn();
String discharge_table_gauge();
@@ -946,6 +962,8 @@
String none();
+ String notselected();
+
String linetype();
String textstyle();
@@ -1134,5 +1152,16 @@
String welcome_open_or_create();
+ String official();
+
+ String inofficial();
+
+ String custom_lines();
+
+ String hws_lines();
+
+ String hws_points();
+
+
}
// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/FLYSConstants.properties
--- a/flys-client/src/main/java/de/intevation/flys/client/client/FLYSConstants.properties Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/FLYSConstants.properties Fri Mar 22 11:25:54 2013 +0100
@@ -98,7 +98,10 @@
downloadPNG = images/png_export.png
downloadPDF = images/pdf_export.png
downloadSVG = images/svg_export.png
-downloadCSV = images/save.png
+downloadCSV = images/save_csv.png
+downloadAT = images/save_at.png
+downloadWST = images/save_wst.png
+loadingImg = images/loading.gif
loadingImg = images/loading.gif
cancelCalculation = images/cancelCalculation.png
markerRed = images/marker_red.png
@@ -151,6 +154,7 @@
name = Name
type = Type
wq_table_w = Characteristic Waterlevels
+wq_waterlevel_label = Characteristic Waterlevel
wq_table_q = Characteristic Discharges / Durations
wq_value_w = W [cm at Gauge]
wq_value_q = Q [m\u00b3/s]
@@ -166,6 +170,7 @@
measurementStationPanelTitle = Measurement Station Information
wqW = W at Gauge [cm]
wqQ = Q [m\u00b3/s]
+wqQatGauge = Q at Gauge [m\u00b3/s]
wqWFree = W free position [m+NHN]
wqQGauge = Discharge at Gauge
wqSingle = Single values
@@ -185,13 +190,15 @@
footerContact = Contact
footerImpressum = Legal info
+projectListMin = format-indent-less.png
+projectListAdd = list-add.png
buttonNext = Next
imageBack = images/back_en.png
imageSave = images/save.png
-theme_top = images/arrow_first.png
-theme_up = images/arrow_up.png
-theme_down = images/arrow_down.png
-theme_bottom = images/arrow_last.png
+theme_top = images/go-first.png
+theme_up = images/go-up.png
+theme_down = images/go-down.png
+theme_bottom = images/go-bottom.png
zoom_all = images/mag_100.png
zoom_in = images/mag_zoom_box.png
zoom_out = images/mag_zoom_minus.png
@@ -201,6 +208,7 @@
add = Add
discharge_curve = Discharge Curve at Gauge
+discharge_curve_gaugeless = Discharge Curve
gauge_discharge_curve = Discharge Table at Gauge
computed_discharge_curve = Discharge Curve
computed_discharge_curves = Discharge Curves
@@ -231,11 +239,11 @@
historical_discharge_export = Historical Discharge Curve Export
showextramark = Show begin of extrapolation
extreme_wq_curve = W/Q
-fix_wq_curve = W/Q
+fix_wq_curve = W/Q-Diagram
fix_deltawt_curve = \u0394 W/t
fix_longitudinal_section_curve = Longitudinal Section
fix_derivate_curve = Derivate
-fix_vollmer_wq_curve = W/Q
+fix_vollmer_wq_curve = W/Q-Diagram
datacage_add_pair = Add difference pair
load_diameter = Bedload Diameter
bed_diameter = Bed Diameter
@@ -265,7 +273,7 @@
gauge_class = Gauge Class
eventselect = Eventselection
events = Events
-kmchart = Chart
+kmchart = W/Q Preview
chart_themepanel_header_themes = Theme
chart_themepanel_header_actions = Actions
@@ -434,6 +442,7 @@
land = Land
rastermap = Rastermap
background = Background Map
+discharge_tables_chart = WQ preview
discharge_table_nn = Discharge Tables at Gauge
discharge_table_gauge = Discharge Table at Gauge
mainvalue = Mainvalue
@@ -446,6 +455,11 @@
epoch = Epoch
bedheights = Bedheights
datacage = Datacage
+official = Official
+inofficial = Unofficial
+custom_lines = Own Digitalizations
+hws_lines = Lines
+hws_points = Points
startcolor = Colorrange start color
endcolor = Colorrange end color
@@ -492,6 +506,8 @@
right = right
none = none
+notselected = none
+
linetype = Linetype
textstyle = Textstyle
linecolor = Linecolor
@@ -572,7 +588,7 @@
sq_overview=Overview
-gauge_zero = Gauge zero ground
+gauge_zero = GZG
gauge_q_unit = m\u00b3/s
gauge_river_info_link = Riverinfo
gauge_info_link = Gaugeinfo
@@ -580,7 +596,7 @@
gauge_river_url = https://flys-intern.intevation.de/GewaesserInfo/
gauge_curve_link = Dischargecurve/-table
discharge_timeranges = DC-Timeranges
-discharge_chart = DC-Chart
+discharge_chart = WQ-Preview
measurement_station_type = Type of Measurement Station
measurement_station_operator = Operator
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/FLYSConstants_de.properties
--- a/flys-client/src/main/java/de/intevation/flys/client/client/FLYSConstants_de.properties Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/FLYSConstants_de.properties Fri Mar 22 11:25:54 2013 +0100
@@ -99,7 +99,9 @@
downloadPNG = images/png_export.png
downloadPDF = images/pdf_export.png
downloadSVG = images/svg_export.png
-downloadCSV = images/save.png
+downloadCSV = images/save_csv.png
+downloadAT = images/save_at.png
+downloadWST = images/save_wst.png
loadingImg = images/loading.gif
cancelCalculation = images/cancelCalculation.png
markerRed = images/marker_red.png
@@ -138,6 +140,7 @@
riverside = Flussseite
name = Name
type = Typ
+wq_waterlevel_label = Kennzeichnender Wasserstand
wq_table_w = Kennzeichnende Wasserst\u00e4nde
wq_table_q = Kennzeichnende Abfl\u00fcsse / Dauerzahlen
wq_value_w = W [cm am Pegel]
@@ -154,6 +157,7 @@
measurementStationPanelTitle = Gew\u00e4sser/Messstellen-Info
wqW = W am Pegel [cm]
wqQ = Q [m\u00b3/s]
+wqQatGauge = Q am Pegel [m\u00b3/s]
wqWFree = W auf freier Strecke [m+NHN]
wqQGauge = Kennzeichnender Abfluss am Pegel
wqSingle = Einzelwerte
@@ -173,13 +177,15 @@
footerContact = Kontakt
footerImpressum = Impressum
+projectListMin = format-indent-less.png
+projectListAdd = list-add.png
buttonNext = \u00dcbernehmen
imageBack = images/back_de.png
imageSave = images/save.png
-theme_top = images/arrow_first.png
-theme_up = images/arrow_up.png
-theme_down = images/arrow_down.png
-theme_bottom = images/arrow_last.png
+theme_top = images/go-first.png
+theme_up = images/go-up.png
+theme_down = images/go-down.png
+theme_bottom = images/go-bottom.png
zoom_all = images/mag_100.png
zoom_in = images/mag_zoom_box.png
zoom_out = images/mag_zoom_minus.png
@@ -202,6 +208,7 @@
chartPropertiesTooltip = Diagrammeigenschaften
discharge_curve = Abflusskurve am Pegel
+discharge_curve_gaugeless = Abflusskurve
gauge_discharge_curve = Abflusstafel am Pegel
computed_discharge_curve = Abflusskurve
computed_discharge_curves = Abflusskurven
@@ -232,11 +239,11 @@
historical_discharge_export = Historische Abflusskurven Export
showextramark = Zeige Anfang der Extrapolation
extreme_wq_curve = W/Q
-fix_wq_curve = W/Q
+fix_wq_curve = W/Q-Diagramm
fix_deltawt_curve = \u0394 W/t
fix_longitudinal_section_curve = L\u00e4ngsschnitt
fix_derivate_curve = Ableitungskurve
-fix_vollmer_wq_curve = W/Q
+fix_vollmer_wq_curve = W/Q-Diagramm
datacage_add_pair = Differenzenpaar hinzuf\u00fcgen
load_diameter = Geschiebedurchmesser
bed_diameter = Sohldurchmesser
@@ -264,7 +271,7 @@
gauge_class = Abflussklasse
eventselect = Ereignisauswahl
events = Ereignisse
-kmchart = Diagramm
+kmchart = W/Q Vorschau
exportATTooltip = Daten als AT Datei exportieren
@@ -436,6 +443,7 @@
land = Land
rastermap = Rasterkarte
background = Hintergrundkarte
+discharge_tables_chart = WQ-Vorschau
discharge_table_nn = Abflusstafeln am Pegel
discharge_table_gauge = Abflusstafel am Pegel
mainvalue = Hauptwerte
@@ -448,6 +456,11 @@
epoch = Epoche
bedheights = Sohlh\u00f6hen
datacage = Datenkorb
+official = Offiziell
+inofficial = Inoffiziell
+custom_lines = Eigene Digitalisierungen
+hws_lines = Liniendaten
+hws_points = Punktdaten
startcolor = Farbverlauf Startfarbe
endcolor = Farbverlauf Endfarbe
@@ -494,6 +507,8 @@
right = rechts
none = keines
+notselected = keine
+
linetype = Linientyp
textstyle = Textstil
linecolor = Linienfarbe
@@ -572,7 +587,7 @@
fix_parameters = CSV
sq_overview=\u00dcbersicht
-gauge_zero = Pegelnullpunkt
+gauge_zero = PNP
gauge_q_unit = m\u00b3/s
gauge_river_info_link = Gew\u00e4sserinfo
gauge_info_link = Pegelinfo
@@ -580,7 +595,7 @@
gauge_river_url = https://flys-intern.intevation.de/GewaesserInfo/
gauge_curve_link = Abflusskurve/-tafel
discharge_timeranges = AK-Zeitr\u00e4ume
-discharge_chart = AK-Diagramm
+discharge_chart = WQ-Vorschau
measurement_station_type = Messstellenart
measurement_station_operator = Betreiber
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/FLYSConstants_en.properties
--- a/flys-client/src/main/java/de/intevation/flys/client/client/FLYSConstants_en.properties Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/FLYSConstants_en.properties Fri Mar 22 11:25:54 2013 +0100
@@ -100,6 +100,9 @@
downloadPDF = images/pdf_export.png
downloadSVG = images/svg_export.png
downloadCSV = images/save.png
+downloadAT = images/save_at.png
+downloadWST = images/save_wst.png
loadingImg = images/loading.gif
cancelCalculation = images/cancelCalculation.png
markerRed = images/marker_red.png
@@ -151,6 +154,7 @@
top_edge = Top edge
name = Name
type = Type
+wq_waterlevel_label = Characteristic Waterlevel
wq_table_w = Characteristic Waterlevels
wq_table_q = Characteristic Discharges / Durations
wq_value_w = W [cm at Gauge]
@@ -167,6 +171,7 @@
measurementStationPanelTitle = Measurement Station Information
wqW = W at Gauge [cm]
wqQ = Q [m\u00b3/s]
+wqQatGauge = Q at Gauge [m\u00b3/s]
wqWFree = W at free position [m+NHN]
wqQGauge = Discharge at Gauge
wqSingle = Single values
@@ -186,13 +191,15 @@
footerContact = Contact
footerImpressum = Legal info
+projectListMin = format-indent-less.png
+projectListAdd = list-add.png
buttonNext = Next
imageBack = images/back_en.png
imageSave = images/save.png
-theme_top = images/arrow_first.png
-theme_up = images/arrow_up.png
-theme_down = images/arrow_down.png
-theme_bottom = images/arrow_last.png
+theme_top = images/go-first.png
+theme_up = images/go-up.png
+theme_down = images/go-down.png
+theme_bottom = images/go-bottom.png
zoom_all = images/mag_100.png
zoom_in = images/mag_zoom_box.png
zoom_out = images/mag_zoom_minus.png
@@ -202,6 +209,7 @@
add = Add
discharge_curve = Discharge Curve at Gauge
+discharge_curve_gaugeless = Discharge Curve
gauge_discharge_curve = Discharge Table at Gauge
computed_discharge_curve = Discharge Curve
computed_discharge_curves = Discharge Curves
@@ -232,11 +240,11 @@
historical_discharge_export = Historical Discharge Curve Export
showextramark = Show begin of extrapolation
extreme_wq_curve = W/Q
-fix_wq_curve = W/Q
+fix_wq_curve = W/Q-Diagram
fix_deltawt_curve = \u0394 W/t
fix_longitudinal_section_curve = Longitudinal Section
fix_derivate_curve = Derivate
-fix_vollmer_wq_curve = W/Q
+fix_vollmer_wq_curve = W/Q-Diagram
datacage_add_pair = Add difference pair
load_diameter = Bedload Diameter
bed_diameter = Bed Diameter
@@ -266,7 +274,7 @@
gauge_class = Gauge Class
eventselect = Eventselection
events = Events
-kmchart = Chart
+kmchart = W/Q Preview
chart_themepanel_header_themes = Theme
chart_themepanel_header_actions = Actions
@@ -436,6 +444,7 @@
land = Land
rastermap = Rastermap
background = Background Layer
+discharge_tables_chart = WQ preview
discharge_table_nn = Discharge Tables at Gauge
discharge_table_gauge = Discharge Table at Gauge
mainvalue = Mainvalue
@@ -494,6 +503,8 @@
right = right
none = none
+notselected = none
+
linetype = Linetype
textstyle = Textstyle
linecolor = Linecolor
@@ -572,7 +583,7 @@
fix_parameters = CSV
sq_overview=Overview
-gauge_zero = Gauge zero ground
+gauge_zero = GZG
gauge_q_unit = m\u00b3/s
gauge_river_info_link = Riverinfo
gauge_info_link = Gaugeinfo
@@ -580,7 +591,7 @@
gauge_river_url = https://flys-intern.intevation.de/GewaesserInfo/
gauge_curve_link = Dischargecurve/-table
discharge_timeranges = DC-Timeranges
-discharge_chart = DC-Chart
+discharge_chart = WQ-Preview
measurement_station_type = Type of Measurement Station
measurement_station_operator = Operator
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/AbstractUIProvider.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/ui/AbstractUIProvider.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/AbstractUIProvider.java Fri Mar 22 11:25:54 2013 +0100
@@ -41,25 +41,25 @@
{
private static final long serialVersionUID = -1610874613377494184L;
- /** The message class that provides i18n strings.*/
+ /** The message class that provides i18n strings. */
protected FLYSConstants MSG = GWT.create(FLYSConstants.class);
- /** The StepForwardHandlers.*/
+ /** The StepForwardHandlers. */
protected List<StepForwardHandler> forwardHandlers;
- /** The StepForwardHandlers.*/
+ /** The StepForwardHandlers. */
protected List<StepBackHandler> backHandlers;
- /** The container that is used to position helper widgets.*/
+ /** The container that is used to position helper widgets. */
protected VLayout helperContainer;
- /** The artifact that contains status information.*/
+ /** The artifact that contains status information. */
protected Artifact artifact;
- /** The Collection.*/
+ /** The Collection. */
protected Collection collection;
- /** The ParameterList.*/
+ /** The ParameterList. */
protected ParameterList parameterList;
/**
@@ -266,6 +266,9 @@
DataList[] old = desc.getOldData();
for (DataList list: old) {
+ if (list == null) {
+ continue;
+ }
Data d = getData(list.getAll(), name);
if (d != null) {
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/DigitizePanel.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/ui/DigitizePanel.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/DigitizePanel.java Fri Mar 22 11:25:54 2013 +0100
@@ -1,22 +1,21 @@
package de.intevation.flys.client.client.ui;
+import java.util.List;
+
+import org.gwtopenmaps.openlayers.client.Map;
+import org.gwtopenmaps.openlayers.client.control.Attribution;
+import org.gwtopenmaps.openlayers.client.layer.WMS;
+import org.gwtopenmaps.openlayers.client.layer.WMSOptions;
+import org.gwtopenmaps.openlayers.client.layer.WMSParams;
+
import com.google.gwt.core.client.GWT;
import com.google.gwt.user.client.rpc.AsyncCallback;
-
-import com.smartgwt.client.types.Encoding;
import com.smartgwt.client.types.VerticalAlignment;
import com.smartgwt.client.util.SC;
-import com.smartgwt.client.widgets.Button;
import com.smartgwt.client.widgets.Canvas;
-import com.smartgwt.client.widgets.HTMLPane;
import com.smartgwt.client.widgets.Label;
-import com.smartgwt.client.widgets.events.ClickEvent;
-import com.smartgwt.client.widgets.events.ClickHandler;
import com.smartgwt.client.widgets.events.VisibilityChangedEvent;
import com.smartgwt.client.widgets.events.VisibilityChangedHandler;
-import com.smartgwt.client.widgets.form.DynamicForm;
-import com.smartgwt.client.widgets.form.fields.SelectItem;
-import com.smartgwt.client.widgets.form.fields.UploadItem;
import com.smartgwt.client.widgets.layout.VLayout;
import com.smartgwt.client.widgets.tab.events.TabSelectedEvent;
import com.smartgwt.client.widgets.tab.events.TabSelectedHandler;
@@ -35,17 +34,6 @@
import de.intevation.flys.client.shared.model.DefaultDataItem;
import de.intevation.flys.client.shared.model.MapInfo;
-import java.util.LinkedHashMap;
-import java.util.List;
-
-import org.gwtopenmaps.openlayers.client.Map;
-import org.gwtopenmaps.openlayers.client.control.Attribution;
-import org.gwtopenmaps.openlayers.client.feature.VectorFeature;
-import org.gwtopenmaps.openlayers.client.format.GeoJSON;
-import org.gwtopenmaps.openlayers.client.layer.WMS;
-import org.gwtopenmaps.openlayers.client.layer.WMSOptions;
-import org.gwtopenmaps.openlayers.client.layer.WMSParams;
-
public class DigitizePanel
extends SelectProvider
@@ -71,37 +59,17 @@
@Override
public Canvas create(DataList list) {
List<Data> data = list.getAll();
-
helperContainer.addVisibilityChangedHandler(this);
- Data barriers = null;
- for (int i = data.size()-1; i >= 0; i--) {
- Data d = data.get(i);
- if (d.getLabel().equals(UESK_BARRIERS)) {
- barriers = d;
- data.remove(d);
- }
- }
-
DataList clone = (DataList) list.clone();
List<Data> all = clone.getAll();
all.remove(UESK_BARRIERS);
- Canvas selectBox = super.create(clone);
+ Canvas widget = createWidget(list);
final Config cfg = Config.getInstance();
final String locale = cfg.getLocale();
- DataItem[] obj = barriers.getItems();
-
- final String[] geojson = new String[1];
- for (DataItem item: obj) {
- if (item.getLabel().equals(UESK_BARRIERS)) {
- geojson[0] = item.getStringValue();
- break;
- }
- }
-
String river = getDataValue("state.winfo.river", "river");
mapInfo.getMapInfo(locale, river, new AsyncCallback<MapInfo>() {
@Override
@@ -114,11 +82,11 @@
@Override
public void onSuccess(MapInfo info) {
- createMapWidget(info, geojson[0]);
+ createMapWidget(info);
}
});
- return selectBox;
+ return widget;
}
@@ -137,10 +105,6 @@
layout.setAlign(VerticalAlignment.TOP);
layout.setHeight(25);
- LinkedHashMap<String, String> initial = new LinkedHashMap<String, String>();
-
- form = new DynamicForm();
-
int size = data.size();
for (int i = 0; i < size; i++) {
@@ -151,74 +115,10 @@
label.setHeight(20);
label.setWidth(400);
- SelectItem combobox = new SelectItem(d.getLabel());
- combobox.setWidth(250);
-
- LinkedHashMap<String, String> it = new LinkedHashMap<String, String>();
-
- boolean defaultSet = false;
- boolean first = true;
-
- DataItem def = d.getDefault();
- String defValue = def != null ? def.getStringValue() : null;
-
- if (defValue != null && defValue.length() > 0) {
- initial.put(d.getLabel(), def.getStringValue());
- defaultSet = true;
- }
-
- for (DataItem item: d.getItems()) {
- if (!defaultSet && first) {
- initial.put(d.getLabel(), item.getStringValue());
- first = false;
- }
-
- it.put(item.getStringValue(), item.getLabel());
- }
-
- label.setWidth(50);
- combobox.setValueMap(it);
- combobox.setShowTitle(false);
- form.setItems(combobox);
-
- HTMLPane uploadTargetFrame = new HTMLPane();
- uploadTargetFrame.setWidth("200px");
- uploadTargetFrame.setHeight("50px");
- uploadTargetFrame.setContents(
- "<iframe id='uploadTarget' name='uploadTarget' scrolling='no' width=200 height=50 style='border: 0px'></iframe>");
- uploadTargetFrame.setBorder("0px");
- uploadTargetFrame.setScrollbarSize(0);
-
- final DynamicForm uploadForm = new DynamicForm();
- uploadForm.setAction("flys/fileupload?uuid=" + artifact.getUuid());
- uploadForm.setTarget("uploadTarget");
- uploadForm.setEncoding(Encoding.MULTIPART);
- Label uploadLabel = new Label(MSG.shape_file_upload());
- uploadLabel.setHeight(20);
- UploadItem uploadItem = new UploadItem();
- uploadItem.setShowTitle(false);
- uploadForm.setFields(uploadItem);
- Button submit = new Button(MSG.upload_file());
- submit.addClickHandler(new ClickHandler() {
- @Override
- public void onClick(ClickEvent e) {
- uploadForm.submitForm();
- }
- });
-
layout.addMember(label);
- layout.addMember(form);
- layout.addMember(uploadLabel);
- layout.addMember(uploadForm);
- layout.addMember(submit);
layout.addMember(getNextButton());
-
- layout.setMembersMargin(10);
- layout.addMember(uploadTargetFrame);
}
- form.setValues(initial);
-
layout.setAlign(VerticalAlignment.TOP);
return layout;
@@ -227,14 +127,12 @@
@Override
protected Data[] getData() {
- Data[] data = super.getData();
- Data[] total = new Data[2];
+ Data[] total = new Data[1];
if (floodMap != null) {
DataItem item = new DefaultDataItem(
UESK_BARRIERS, UESK_BARRIERS, floodMap.getFeaturesAsGeoJSON());
- total[0] = data[0];
- total[1] = new DefaultData(
+ total[0] = new DefaultData(
UESK_BARRIERS, null, null, new DataItem[] { item });
}
else {
@@ -246,7 +144,7 @@
}
- public void createMapWidget(MapInfo mapInfo, String geojson) {
+ public void createMapWidget(MapInfo mapInfo) {
mapPanel = new MapPanel(mapInfo, true);
floodMap = mapPanel.getFloodMap();
@@ -266,9 +164,25 @@
map.addLayer(back);
map.addLayer(axis);
- if (geojson != null && geojson.length() > 0) {
- VectorFeature[] features = new GeoJSON().read(geojson);
- floodMap.getBarrierLayer().addFeatures(features);
+ String hws = getDataValue("state.winfo.uesk.dc-hws", "uesk.hws");
+ if (hws != null && hws.length() > 0) {
+ WMS hwsLayer = getLayer(
+ //TODO: Use MapInfo to get HWS layer info.
+ mapInfo.getWmsUrl().replace("river", "user"),
+ "ms_layer-hws-lines" + artifact.getUuid(),
+ mapInfo.getProjection(),
+ false);
+ map.addLayer(hwsLayer);
+ }
+ String userRgd = getDataValue("state.winfo.uesk.user-rgd", "uesk.user-rgd");
+ if (userRgd != null && userRgd.length() > 0) {
+ WMS userLayer = getLayer(
+ //TODO: Use MapInfo to get HWS layer info.
+ mapInfo.getWmsUrl().replace("river", "user"),
+ "ms_layer-user-rgd" + artifact.getUuid(),
+ mapInfo.getProjection(),
+ false);
+ map.addLayer(userLayer);
}
map.addControl(new Attribution());
map.zoomToMaxExtent();
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/DoubleArrayPanel.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/ui/DoubleArrayPanel.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/DoubleArrayPanel.java Fri Mar 22 11:25:54 2013 +0100
@@ -9,6 +9,7 @@
import com.smartgwt.client.widgets.form.fields.StaticTextItem;
import com.smartgwt.client.widgets.form.fields.TextItem;
import com.smartgwt.client.widgets.form.fields.events.BlurHandler;
+import com.smartgwt.client.widgets.form.fields.events.FocusHandler;
import de.intevation.flys.client.client.FLYSConstants;
@@ -22,6 +23,8 @@
protected TextItem ti;
+ private String title;
+
/** The constant input field name. */
public static final String FIELD_NAME = "doublearray";
@@ -31,7 +34,7 @@
double[] values,
BlurHandler handler)
{
- this(title, values, handler, TitleOrientation.RIGHT);
+ this(title, values, handler, null, TitleOrientation.RIGHT);
}
@@ -42,14 +45,17 @@
* @param name The name of the TextItem.
* @param title The title of the TextItem.
* @param values The double values that should be displayed initially.
- * @param handler The BlurHandler that is used to valide the input.
+ * @param blurHandler The BlurHandler that is used to validate the input.
+ * @param focusHandler The FocusHandler that is invoked when the input receives focus.
*/
public DoubleArrayPanel(
String title,
double[] values,
- BlurHandler handler,
+ BlurHandler blurHandler,
+ FocusHandler focusHandler,
TitleOrientation titleOrientation)
{
+ this.title = title;
ti = new TextItem(FIELD_NAME);
StaticTextItem sti = new StaticTextItem("staticarray");
@@ -57,7 +63,10 @@
sti.setShowTitle(false);
sti.setValue(title);
- ti.addBlurHandler(handler);
+ ti.addBlurHandler(blurHandler);
+ if (focusHandler != null) {
+ ti.addFocusHandler(focusHandler);
+ }
if (titleOrientation == TitleOrientation.RIGHT) {
setFields(ti, sti);
@@ -263,5 +272,9 @@
public double[] getInputValues() {
return getInputValues(ti);
}
+
+ public String getItemTitle() {
+ return this.title;
+ }
}
// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/ExportPanel.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/ui/ExportPanel.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/ExportPanel.java Fri Mar 22 11:25:54 2013 +0100
@@ -101,9 +101,23 @@
String filename
) {
String url = getExportUrl(name, facet, filename);
- String iUrl = GWT.getHostPageBaseURL() + MSG.imageSave();
-
- ImgLink link = new ImgLink(iUrl, url, 30, 30);
+ String imgUrl = GWT.getHostPageBaseURL();
+ if (facet.equals("pdf")) {
+ imgUrl += MSG.downloadPDF();
+ }
+ else if (facet.equals("at")) {
+ imgUrl += MSG.downloadAT();
+ }
+ else if (facet.equals("wst")) {
+ imgUrl += MSG.downloadWST();
+ }
+ else if (facet.equals("csv")) {
+ imgUrl += MSG.downloadCSV();
+ }
+ else {
+ imgUrl += MSG.imageSave();
+ }
+ ImgLink link = new ImgLink(imgUrl, url, 30, 30);
link.setTooltip(getTooltipText(name, facet));
return link;
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/GaugeTimeRangePanel.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/ui/GaugeTimeRangePanel.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/GaugeTimeRangePanel.java Fri Mar 22 11:25:54 2013 +0100
@@ -284,7 +284,7 @@
};
- ListGridField addstart = new ListGridField ("", "");
+ ListGridField addstart = new ListGridField ("", MESSAGES.from());
addstart.setType (ListGridFieldType.ICON);
addstart.setWidth (20);
addstart.setCellIcon(baseUrl + MESSAGES.markerGreen());
@@ -301,7 +301,7 @@
}
});
- ListGridField addend = new ListGridField ("", "");
+ ListGridField addend = new ListGridField ("", MESSAGES.to());
addend.setType (ListGridFieldType.ICON);
addend.setWidth (20);
addend.setCellIcon(baseUrl + MESSAGES.markerRed());
@@ -319,7 +319,7 @@
});
ListGridField desc =
- new ListGridField("description", MESSAGES.description());
+ new ListGridField("description", MESSAGES.discharge_curve_gaugeless());
desc.setType(ListGridFieldType.TEXT);
desc.setWidth("*");
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/HWSDatacagePanel.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/HWSDatacagePanel.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,129 @@
+package de.intevation.flys.client.client.ui;
+
+import java.util.ArrayList;
+import java.util.List;
+
+import com.google.gwt.core.client.GWT;
+import com.smartgwt.client.widgets.Canvas;
+import com.smartgwt.client.widgets.Label;
+import com.smartgwt.client.widgets.grid.ListGridRecord;
+import com.smartgwt.client.widgets.layout.HLayout;
+import com.smartgwt.client.widgets.layout.VLayout;
+
+import de.intevation.flys.client.shared.model.Data;
+import de.intevation.flys.client.shared.model.DataItem;
+import de.intevation.flys.client.shared.model.DataList;
+import de.intevation.flys.client.shared.model.DefaultData;
+import de.intevation.flys.client.shared.model.DefaultDataItem;
+import de.intevation.flys.client.shared.model.Recommendation;
+import de.intevation.flys.client.shared.model.ToLoad;
+import de.intevation.flys.client.shared.model.User;
+
+
+public class HWSDatacagePanel
+extends DatacagePanel
+{
+ public static final String OUT = "floodmap-hws";
+ public static final String PARAMETERS = "hws:true;load-system:true";
+
+
+ public HWSDatacagePanel() {
+ super();
+ }
+
+
+ public HWSDatacagePanel(User user) {
+ super(user);
+ }
+
+
+ @Override
+ protected void createWidget() {
+ super.createWidget();
+ widget.setIsMutliSelectable(true);
+ }
+
+
+ @Override
+ public String getOuts() {
+ return OUT;
+ }
+
+
+ @Override
+ public String getParameters() {
+ return PARAMETERS;
+ }
+
+
+ @Override
+ public List<String> validate() {
+ List<String> errors = new ArrayList<String>();
+
+ return errors;
+ }
+
+ @Override
+ public Canvas createOld(DataList dataList) {
+ GWT.log("old datacage##########################################");
+ HLayout layout = new HLayout();
+ VLayout vLayout = new VLayout();
+ layout.setWidth("400px");
+
+ Label label = new Label(dataList.getLabel());
+ label.setWidth("200px");
+
+ int size = dataList.size();
+ for (int i = 0; i < size; i++) {
+ Data data = dataList.get(i);
+ DataItem[] items = data.getItems();
+
+ for (DataItem item: items) {
+ HLayout hLayout = new HLayout();
+
+ hLayout.addMember(label);
+ hLayout.addMember(new Label(item.getLabel()));
+
+ vLayout.addMember(hLayout);
+ vLayout.setWidth("130px");
+ }
+ }
+
+ Canvas back = getBackButton(dataList.getState());
+
+ layout.addMember(label);
+ layout.addMember(vLayout);
+ layout.addMember(back);
+
+ return layout;
+ }
+
+
+ @Override
+ protected Data[] getData() {
+ String[] selection = this.widget.getSelectionTitles();
+ String result = "";
+ boolean first = true;
+ if (selection != null) {
+ for (String record: selection) {
+ if (first) {
+ result += record;
+ first = false;
+ }
+ else {
+ result += ";" + record;
+ }
+ }
+ }
+ if (result.length() == 0) {
+ result = MSG.notselected();
+ }
+ Data[] data = new Data[1];
+ DataItem item = new DefaultDataItem(
+ "uesk.hws", "uesk.hws", result);
+ data[0] = new DefaultData("uesk.hws", null, null, new DataItem[] {item});
+
+ return data;
+ }
+}
+// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
\ No newline at end of file
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/LocationDistancePanel.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/ui/LocationDistancePanel.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/LocationDistancePanel.java Fri Mar 22 11:25:54 2013 +0100
@@ -1205,7 +1205,7 @@
inputTables.updateTab(0, null);
inputTables.removeTab(0);
- //Create a new tab containing the locationDistanceTable.
+ // Create a new tab containing the locationDistanceTable.
Tab t1 = new Tab(MESSAGES.locations());
t1.setPane(locationDistanceTable);
inputTables.addTab(t1, 0);
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/ProjectList.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/ui/ProjectList.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/ProjectList.java Fri Mar 22 11:25:54 2013 +0100
@@ -1,11 +1,9 @@
package de.intevation.flys.client.client.ui;
import com.google.gwt.core.client.GWT;
-import com.google.gwt.event.dom.client.ClickEvent;
import com.google.gwt.i18n.client.DateTimeFormat;
import com.google.gwt.user.client.Timer;
import com.google.gwt.user.client.rpc.AsyncCallback;
-import com.google.gwt.user.client.ui.Button;
import com.smartgwt.client.types.Alignment;
import com.smartgwt.client.types.Autofit;
@@ -19,7 +17,9 @@
import com.smartgwt.client.util.BooleanCallback;
import com.smartgwt.client.util.SC;
import com.smartgwt.client.widgets.Canvas;
+import com.smartgwt.client.widgets.IconButton;
import com.smartgwt.client.widgets.Label;
+import com.smartgwt.client.widgets.events.ClickEvent;
import com.smartgwt.client.widgets.events.VisibilityChangedEvent;
import com.smartgwt.client.widgets.events.VisibilityChangedHandler;
import com.smartgwt.client.widgets.grid.CellFormatter;
@@ -383,14 +383,14 @@
HLayout buttonWrapper = new HLayout();
- Button addButton = new Button("+");
- addButton.setStyleName("projectsAddButton");
- addButton.setTitle(messages.new_project());
+ IconButton addButton = new IconButton("");
+ addButton.setIcon(messages.projectListAdd());
+ addButton.setTooltip(messages.new_project());
addButton.setWidth("30px");
- Button closeButton = new Button("X");
- closeButton.setStyleName("projectsCloseButton");
- closeButton.setTitle(messages.projectlist_close());
+ IconButton closeButton = new IconButton("");
+ closeButton.setIcon(messages.projectListMin());
+ closeButton.setTooltip(messages.projectlist_close());
closeButton.setWidth("30px");
buttonWrapper.addMember(addButton);
@@ -420,7 +420,7 @@
addMember(filterpanel);
addButton.addClickHandler(
- new com.google.gwt.event.dom.client.ClickHandler() {
+ new com.smartgwt.client.widgets.events.ClickHandler() {
@Override
public void onClick(ClickEvent ev) {
@@ -429,7 +429,7 @@
});
closeButton.addClickHandler(
- new com.google.gwt.event.dom.client.ClickHandler() {
+ new com.smartgwt.client.widgets.events.ClickHandler() {
@Override
public void onClick(ClickEvent ev) {
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/QSegmentedInputPanel.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/ui/QSegmentedInputPanel.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/QSegmentedInputPanel.java Fri Mar 22 11:25:54 2013 +0100
@@ -180,7 +180,7 @@
HLayout h = new HLayout();
String[] parts = gauge.split(GAUGE_PART_SEPARATOR);
- String[] values = parts[2].split(VALUE_SEPARATOR);
+ String[] values = parts[3].split(VALUE_SEPARATOR);
Label l = new Label(parts[0] + " - " + parts[1] + ": ");
@@ -297,7 +297,7 @@
String title = item.getLabel();
DoubleArrayPanel dap = new DoubleArrayPanel(
- createLineTitle(title), null, this, TitleOrientation.LEFT);
+ createLineTitle(title), null, this, null, TitleOrientation.LEFT);
wqranges.put(title, dap);
@@ -361,10 +361,10 @@
double[] values = dap.getInputValues();
if (wqvalue == null) {
- wqvalue = createValueString(key, values);
+ wqvalue = createValueString(key + "; ", values);
}
else {
- wqvalue += GAUGE_SEPARATOR + createValueString(key, values);
+ wqvalue += GAUGE_SEPARATOR + createValueString(key + "; ", values);
}
}
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/UIProviderFactory.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/ui/UIProviderFactory.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/UIProviderFactory.java Fri Mar 22 11:25:54 2013 +0100
@@ -1,5 +1,7 @@
package de.intevation.flys.client.client.ui;
+import com.google.gwt.core.client.GWT;
+
import de.intevation.flys.client.client.ui.fixation.FixEventSelect;
import de.intevation.flys.client.client.ui.fixation.FixFunctionSelect;
import de.intevation.flys.client.client.ui.fixation.FixGaugeSelectPanel;
@@ -174,6 +176,12 @@
else if (uiProvider.equals("minfo.sedimentload_offepoch_select")) {
return new SedLoadOffEpochPanel();
}
+ else if (uiProvider.equals("hws_datacage_panel")) {
+ return new HWSDatacagePanel(user);
+ }
+ else if (uiProvider.equals("user_rgd_panel")) {
+ return new UserRGDProvider();
+ }
else {
//GWT.log("Picked default provider.");
return new SelectProvider();
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/UserRGDProvider.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/UserRGDProvider.java Fri Mar 22 11:25:54 2013 +0100
@@ -0,0 +1,133 @@
+package de.intevation.flys.client.client.ui;
+
+import java.util.List;
+
+import com.google.gwt.core.client.GWT;
+import com.smartgwt.client.types.Encoding;
+import com.smartgwt.client.types.VerticalAlignment;
+import com.smartgwt.client.widgets.Button;
+import com.smartgwt.client.widgets.Canvas;
+import com.smartgwt.client.widgets.HTMLPane;
+import com.smartgwt.client.widgets.Label;
+import com.smartgwt.client.widgets.events.ClickEvent;
+import com.smartgwt.client.widgets.events.ClickHandler;
+import com.smartgwt.client.widgets.form.DynamicForm;
+import com.smartgwt.client.widgets.form.fields.UploadItem;
+import com.smartgwt.client.widgets.layout.VLayout;
+
+import de.intevation.flys.client.shared.model.Data;
+import de.intevation.flys.client.shared.model.DataItem;
+import de.intevation.flys.client.shared.model.DataList;
+import de.intevation.flys.client.shared.model.DefaultData;
+import de.intevation.flys.client.shared.model.DefaultDataItem;
+
+
+public class UserRGDProvider
+extends SelectProvider
+{
+
+ private HTMLPane uploadTargetFrame;
+ private String uploadFile;
+
+ public UserRGDProvider() {
+ uploadTargetFrame = new HTMLPane();
+ }
+
+ @Override
+ public Canvas create(DataList list) {
+ List<Data> data = list.getAll();
+
+ //Canvas selectBox = super.create(clone);
+ Canvas widget = createWidget(list);
+
+ return widget;
+ }
+
+
+ /**
+ * This method creates the content of the widget.
+ *
+ * @param data The {@link DataList} object.
+ *
+ * @return a combobox.
+ */
+ @Override
+ protected Canvas createWidget(DataList data) {
+ GWT.log("DigitizePanel - createWidget()");
+
+ VLayout layout = new VLayout();
+ layout.setAlign(VerticalAlignment.TOP);
+ layout.setHeight(25);
+
+ int size = data.size();
+
+ for (int i = 0; i < size; i++) {
+ Data d = data.get(i);
+
+ Label label = new Label(d.getDescription());
+ label.setValign(VerticalAlignment.TOP);
+ label.setHeight(20);
+ label.setWidth(400);
+
+ uploadTargetFrame.setWidth("200px");
+ uploadTargetFrame.setHeight("50px");
+ uploadTargetFrame.setContents(
+ "<iframe id='uploadTarget' name='uploadTarget' scrolling='no' width=200 height=50 style='border: 0px'></iframe>");
+ uploadTargetFrame.setBorder("0px");
+ uploadTargetFrame.setScrollbarSize(0);
+
+ final DynamicForm uploadForm = new DynamicForm();
+ uploadForm.setAction("flys/fileupload?uuid=" + artifact.getUuid());
+ uploadForm.setTarget("uploadTarget");
+ uploadForm.setEncoding(Encoding.MULTIPART);
+ Label uploadLabel = new Label(MSG.shape_file_upload());
+ uploadLabel.setHeight(20);
+ final UploadItem uploadItem = new UploadItem();
+ uploadItem.setShowTitle(false);
+ uploadForm.setFields(uploadItem);
+ Button submit = new Button(MSG.upload_file());
+ submit.addClickHandler(new ClickHandler() {
+ @Override
+ public void onClick(ClickEvent e) {
+ uploadFile = uploadItem.getValueAsString();
+ uploadForm.submitForm();
+ }
+ });
+
+ layout.addMember(label);
+ layout.addMember(form);
+ layout.addMember(uploadLabel);
+ layout.addMember(uploadForm);
+ layout.addMember(submit);
+ layout.addMember(getNextButton());
+
+ layout.setMembersMargin(10);
+ layout.addMember(uploadTargetFrame);
+ }
+
+ layout.setAlign(VerticalAlignment.TOP);
+
+ return layout;
+ }
+
+ @Override
+ protected Data[] getData() {
+ Data[] total = new Data[1];
+
+ if (uploadFile != null && uploadFile.length() > 0) {
+ DataItem item = new DefaultDataItem(
+ "uesk.user-rgd", "uesk.user-rgd", uploadFile);
+ total[0] = new DefaultData(
+ "uesk.user-rgd", null, null, new DataItem[] { item });
+ }
+ else {
+ // Happens when OpenLayers is missing
+ DataItem item = new DefaultDataItem(
+ "uesk.user-rgd", "uesk.user-rgd", MSG.notselected());
+ total[0] = new DefaultData(
+ "uesk.user-rgd", null, null, new DataItem[] { item });
+ }
+
+ return total;
+ }
+}
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/WQAdaptedInputPanel.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/ui/WQAdaptedInputPanel.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/WQAdaptedInputPanel.java Fri Mar 22 11:25:54 2013 +0100
@@ -4,17 +4,23 @@
import com.google.gwt.i18n.client.NumberFormat;
import com.google.gwt.user.client.rpc.AsyncCallback;
+import com.smartgwt.client.data.Record;
import com.smartgwt.client.types.TitleOrientation;
import com.smartgwt.client.types.VerticalAlignment;
import com.smartgwt.client.util.SC;
import com.smartgwt.client.widgets.Canvas;
import com.smartgwt.client.widgets.Label;
import com.smartgwt.client.widgets.form.DynamicForm;
+import com.smartgwt.client.widgets.form.fields.FormItem;
import com.smartgwt.client.widgets.form.fields.RadioGroupItem;
import com.smartgwt.client.widgets.form.fields.events.BlurEvent;
import com.smartgwt.client.widgets.form.fields.events.BlurHandler;
import com.smartgwt.client.widgets.form.fields.events.ChangeEvent;
import com.smartgwt.client.widgets.form.fields.events.ChangeHandler;
+import com.smartgwt.client.widgets.form.fields.events.FocusEvent;
+import com.smartgwt.client.widgets.form.fields.events.FocusHandler;
+import com.smartgwt.client.widgets.grid.events.CellClickEvent;
+import com.smartgwt.client.widgets.grid.events.CellClickHandler;
import com.smartgwt.client.widgets.layout.HLayout;
import com.smartgwt.client.widgets.layout.VLayout;
import com.smartgwt.client.widgets.tab.Tab;
@@ -52,10 +58,13 @@
*/
public class WQAdaptedInputPanel
extends AbstractUIProvider
-implements ChangeHandler, BlurHandler
+implements ChangeHandler, BlurHandler, FocusHandler
{
private static final long serialVersionUID = -3218827566805476423L;
+ /** The message class that provides i18n strings.*/
+ protected FLYSConstants MESSAGE = GWT.create(FLYSConstants.class);
+
public static final String FIELD_WQ_MODE = "wq_isq";
public static final String FIELD_WQ_W = "W";
public static final String FIELD_WQ_Q = "Q";
@@ -95,12 +104,18 @@
/** The RadioGroupItem that determines the w/q input mode.*/
protected DynamicForm modes;
+ /** Table holding Q and D values. */
protected QDTable qdTable;
+ /** Table holding W values. */
protected WTable wTable;
+ /** Tabs in inputhelper area. */
protected TabSet tabs;
+ /** The currently focussed Input element. */
+ protected DoubleArrayPanel itemWithFocus;
+
public WQAdaptedInputPanel() {
wqranges = new HashMap<String, DoubleArrayPanel>();
@@ -108,6 +123,7 @@
wranges = new HashMap<String, double[]>();
qdTable = new QDTable();
wTable = new WTable();
+ initTableListeners();
}
@@ -129,23 +145,23 @@
layout.addMember(widget);
layout.addMember(submit);
-
return layout;
}
+ /** Inits the helper panel. */
+ // TODO duplicate in WQInputPanel
protected void initHelperPanel() {
tabs = new TabSet();
tabs.setWidth100();
tabs.setHeight100();
- // TODO i18n
- Tab wTab = new Tab("W");
- Tab qTab = new Tab("Q / D");
+ Tab wTab = new Tab(MESSAGE.wq_table_w());
+ Tab qTab = new Tab(MESSAGE.wq_table_q());
+ qdTable.showSelect();
wTab.setPane(wTable);
qTab.setPane(qdTable);
- qdTable.hideIconFields();
tabs.addTab(wTab, 0);
tabs.addTab(qTab, 1);
@@ -156,12 +172,37 @@
}
+ /**
+ * Initializes the listeners of the WQD tables.
+ */
+ // TODO dupe from WQInputPanel
+ protected void initTableListeners() {
+ CellClickHandler handler = new CellClickHandler() {
+ @Override
+ public void onCellClick(CellClickEvent e) {
+ if (isWMode() || qdTable.isLocked()) {
+ return;
+ }
+
+ int idx = e.getColNum();
+ Record r = e.getRecord();
+ double val = r.getAttributeAsDouble("value");
+
+ if (itemWithFocus != null) {
+ itemWithFocus.setValues(new double[]{val});
+ }
+ }
+ };
+
+ qdTable.addCellClickHandler(handler);
+ }
+
@Override
public Canvas createOld(DataList dataList) {
List<Data> all = dataList.getAll();
Data wqData = getData(all, "wq_values");
Data wqMode = getData(all, "wq_isq");
-
+ boolean isQ = wqMode.getItems()[0].getStringValue().equals("true");
Canvas back = getBackButton(dataList.getState());
HLayout valLayout = new HLayout();
@@ -179,7 +220,8 @@
modeLabel.setWidth(200);
valLayout.addMember(wqLabel);
- valLayout.addMember(createOldWQValues(wqData));
+ valLayout.addMember(createOldWQValues(wqData, isQ));
+
valLayout.addMember(back);
modeLayout.addMember(modeLabel);
@@ -190,7 +232,8 @@
}
- protected Canvas createOldWQValues(Data wqData) {
+ /** Create area showing previously entered w or q data. */
+ protected Canvas createOldWQValues(Data wqData, boolean isQ) {
VLayout layout = new VLayout();
DataItem item = wqData.getItems()[0];
@@ -198,13 +241,15 @@
String[] gauges = value.split(GAUGE_SEPARATOR);
+ String unit = isQ ? "m³/s" : "cm";
+
for (String gauge: gauges) {
HLayout h = new HLayout();
String[] parts = gauge.split(GAUGE_PART_SEPARATOR);
- String[] values = parts[2].split(VALUE_SEPARATOR);
+ String[] values = parts[3].split(VALUE_SEPARATOR);
- Label l = new Label(parts[0] + " - " + parts[1] + ": ");
+ Label l = new Label(parts[2] + ": ");
StringBuilder sb = new StringBuilder();
boolean first = true;
@@ -215,6 +260,8 @@
}
sb.append(v);
+ sb.append(" ");
+ sb.append(unit);
first = false;
}
@@ -268,8 +315,7 @@
}
}
-
- protected List<String> validateW() {
+ protected List<String> validateRange(Map<String, double[]> ranges) {
List<String> errors = new ArrayList<String>();
NumberFormat nf = NumberFormat.getDecimalFormat();
@@ -286,7 +332,7 @@
return errors;
}
- double[] mm = wranges.get(key);
+ double[] mm = ranges.get(key);
if (mm == null) {
SC.warn(MSG.error_read_minmax_values());
continue;
@@ -326,65 +372,17 @@
}
- protected List<String> validateQ() {
- List<String> errors = new ArrayList<String>();
- NumberFormat nf = NumberFormat.getDecimalFormat();
-
- Iterator<String> iter = wqranges.keySet().iterator();
-
- while (iter.hasNext()) {
- List<String> tmpErrors = new ArrayList<String>();
-
- String key = iter.next();
- DoubleArrayPanel dap = wqranges.get(key);
-
- if (!dap.validateForm()) {
- errors.add(MSG.error_invalid_double_value());
- return errors;
- }
-
- double[] mm = qranges.get(key);
- if (mm == null) {
- SC.warn(MSG.error_read_minmax_values());
- continue;
- }
-
- double[] values = dap.getInputValues();
- double[] good = new double[values.length];
-
- int idx = 0;
-
- for (double value: values) {
- if (value < mm[0] || value > mm[1]) {
- String tmp = MSG.error_validate_range();
- tmp = tmp.replace("$1", nf.format(value));
- tmp = tmp.replace("$2", nf.format(mm[0]));
- tmp = tmp.replace("$3", nf.format(mm[1]));
- tmpErrors.add(tmp);
- }
- else {
- good[idx++] = value;
- }
- }
-
- double[] justGood = new double[idx];
- for (int i = 0; i < justGood.length; i++) {
- justGood[i] = good[i];
- }
-
- if (!tmpErrors.isEmpty()) {
- dap.setValues(justGood);
-
- errors.addAll(tmpErrors);
- }
- }
-
- return errors;
+ protected List<String> validateW() {
+ return validateRange(wranges);
}
+ protected List<String> validateQ() {
+ return validateRange(qranges);
+ }
+
+
protected void initUserDefaults(DataList dataList) {
-
initUserWQValues(dataList);
initUserWQMode(dataList);
}
@@ -464,9 +462,9 @@
for (DataItem item: items) {
String title = item.getLabel();
-
+ String label = item.getStringValue();
DoubleArrayPanel dap = new DoubleArrayPanel(
- createLineTitle(title), null, this, TitleOrientation.LEFT);
+ label, null, this, this, TitleOrientation.LEFT);
wqranges.put(title, dap);
@@ -488,6 +486,7 @@
}
+ /** Get items which are not WQ_MODE. */
protected DataItem[] getWQItems(DataList dataList) {
List<Data> data = dataList.getAll();
@@ -505,6 +504,10 @@
}
+ /**
+ * Create radio button for switching w and q input.
+ * Radiobutton-change also triggers helper panel tab selection.
+ */
protected Canvas createMode(DataList dataList) {
RadioGroupItem wq = new RadioGroupItem(FIELD_WQ_MODE);
wq.setShowTitle(false);
@@ -513,7 +516,7 @@
LinkedHashMap wqValues = new LinkedHashMap();
wqValues.put(FIELD_WQ_W, MSG.wqW());
- wqValues.put(FIELD_WQ_Q, MSG.wqQ());
+ wqValues.put(FIELD_WQ_Q, MSG.wqQatGauge());
wq.setValueMap(wqValues);
@@ -587,13 +590,14 @@
while (iter.hasNext()) {
String key = iter.next();
DoubleArrayPanel dap = wqranges.get(key);
+ String label = dap.getItemTitle();
double[] values = dap.getInputValues();
if (wqvalue == null) {
- wqvalue = createValueString(key, values);
+ wqvalue = createValueString(key + ";" + label, values);
}
else {
- wqvalue += GAUGE_SEPARATOR + createValueString(key, values);
+ wqvalue += GAUGE_SEPARATOR + createValueString(key + ";" + label, values);
}
}
@@ -631,6 +635,13 @@
}
+ /** Store the currently focussed DoubleArrayPanel. */
+ @Override
+ public void onFocus(FocusEvent event) {
+ itemWithFocus = (DoubleArrayPanel) event.getForm();
+ }
+
+
@Override
public void onBlur(BlurEvent event) {
DoubleArrayPanel dap = (DoubleArrayPanel) event.getForm();
@@ -714,12 +725,5 @@
ArtifactDescription adesc = artifact.getArtifactDescription();
return adesc.getRiver();
}
-
-
- protected void updatePanels(boolean isQ) {
-
- }
-
-
}
// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/WQInputPanel.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/ui/WQInputPanel.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/WQInputPanel.java Fri Mar 22 11:25:54 2013 +0100
@@ -276,6 +276,8 @@
}
+ /** Inits the helper panel. */
+ // TODO duplicate in WQAdaptedInputPanel
protected void initHelperPanel() {
tabs = new TabSet();
tabs.setWidth100();
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/WQSimpleArrayPanel.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/ui/WQSimpleArrayPanel.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/WQSimpleArrayPanel.java Fri Mar 22 11:25:54 2013 +0100
@@ -184,7 +184,7 @@
public void clickedLower(double value) {
panelW.addValue(value);
}
- }, ClickMode.SINGLE);
+ }, ClickMode.SINGLE, true);
qTable = new ClickableQDTable(new ClickableQDTable.QClickedListener() {
@@ -210,7 +210,7 @@
Tab w = new Tab(MSG.wq_table_w());
Tab q = new Tab(MSG.wq_table_q());
- Tab c = new Tab(MSG.chart());
+ Tab c = new Tab(MSG.discharge_tables_chart());
w.setPane(wTable);
q.setPane(qTable);
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/chart/ChartPropertiesEditor.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/ui/chart/ChartPropertiesEditor.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/chart/ChartPropertiesEditor.java Fri Mar 22 11:25:54 2013 +0100
@@ -78,6 +78,7 @@
protected OutputSettings origSettings;
+
/**
* Setup editor dialog.
* @param callerTab The tab that called the editor window.
@@ -172,7 +173,7 @@
/**
- *
+ * Create a section from group (usually axis properties).
*/
protected Canvas generatePropertyGroup(Property group, Property orig) {
PropertyGroup pg = (PropertyGroup)group;
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/fixation/FixEventSelect.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/ui/fixation/FixEventSelect.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/fixation/FixEventSelect.java Fri Mar 22 11:25:54 2013 +0100
@@ -24,7 +24,7 @@
import de.intevation.flys.client.client.services.FixingsOverviewServiceAsync;
/**
- * This UIProvider creates a panel for location or distance input.
+ * This UIProvider lets you select events.
*
* @author <a href="mailto:raimund.renkert at intevation.de">Raimund Renkert</a>
*/
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/fixation/FixationPanel.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/ui/fixation/FixationPanel.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/fixation/FixationPanel.java Fri Mar 22 11:25:54 2013 +0100
@@ -366,6 +366,7 @@
public abstract void success();
+ /** Creates JSON string from filter. */
public static String getOverviewFilter(FixFilter filter) {
String river = filter.getRiver();
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/stationinfo/GaugePanel.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/ui/stationinfo/GaugePanel.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/stationinfo/GaugePanel.java Fri Mar 22 11:25:54 2013 +0100
@@ -1,8 +1,5 @@
package de.intevation.flys.client.client.ui.stationinfo;
-import com.smartgwt.client.util.SC;
-import com.smartgwt.client.widgets.Label;
-
import com.google.gwt.core.client.GWT;
import com.google.gwt.user.client.rpc.AsyncCallback;
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/stationinfo/MeasurementStationRecord.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/ui/stationinfo/MeasurementStationRecord.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/stationinfo/MeasurementStationRecord.java Fri Mar 22 11:25:54 2013 +0100
@@ -137,7 +137,7 @@
private void setOperator(String value) {
this.setAttribute("operator", value);
}
-
+
@Override
public Date getStartTime() {
return this.getAttributeAsDate("starttime");
@@ -168,7 +168,7 @@
public String getLink() {
return this.getAttributeAsString("link");
}
-
+
public void setLink(String link) {
this.setAttribute("link", link);
}
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/wq/ClickableWTable.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/ui/wq/ClickableWTable.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/wq/ClickableWTable.java Fri Mar 22 11:25:54 2013 +0100
@@ -21,6 +21,8 @@
NONE, SINGLE, RANGE
}
+ private boolean useWaterlevelLabel = false;
+
public static interface WClickedListener {
void clickedLower(double value);
@@ -40,9 +42,10 @@
}
public ClickableWTable(WClickedListener lowerListener,
- ClickMode selectionMode) {
+ ClickMode selectionMode, boolean alternativeLabel) {
this.wClickedListener = lowerListener;
this.clickMode = selectionMode;
+ this.useWaterlevelLabel = alternativeLabel;
init();
}
@@ -56,7 +59,8 @@
setShowRecordComponentsByCell(true);
setEmptyMessage(MESSAGE.empty_table());
- ListGridField name = new ListGridField("name", MESSAGE.name());
+ ListGridField name = new ListGridField("name",
+ useWaterlevelLabel ? MESSAGE.wq_waterlevel_label() : MESSAGE.name() );
name.setType(ListGridFieldType.TEXT);
name.setWidth("*");
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/wq/QDTable.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/ui/wq/QDTable.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/wq/QDTable.java Fri Mar 22 11:25:54 2013 +0100
@@ -109,6 +109,5 @@
public boolean isLocked() {
return lockClick;
}
-
}
// vim:set ts=4 sw=4 si et sta sts=4 fenc=utf8 :
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/client/ui/wq/WQAutoTabSet.java
--- a/flys-client/src/main/java/de/intevation/flys/client/client/ui/wq/WQAutoTabSet.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/client/ui/wq/WQAutoTabSet.java Fri Mar 22 11:25:54 2013 +0100
@@ -1,7 +1,6 @@
package de.intevation.flys.client.client.ui.wq;
import com.google.gwt.core.client.GWT;
-import com.google.gwt.i18n.client.NumberFormat;
import com.smartgwt.client.util.SC;
import com.smartgwt.client.widgets.tab.Tab;
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/server/ChartInfoServiceImpl.java
--- a/flys-client/src/main/java/de/intevation/flys/client/server/ChartInfoServiceImpl.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/server/ChartInfoServiceImpl.java Fri Mar 22 11:25:54 2013 +0100
@@ -48,7 +48,6 @@
private static final Logger logger =
Logger.getLogger(ChartInfoServiceImpl.class);
-
public static final String XPATH_TRANSFORM_MATRIX =
"/art:chartinfo/art:transformation-matrix/art:matrix";
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/java/de/intevation/flys/client/server/CollectionItemAttributeServiceImpl.java
--- a/flys-client/src/main/java/de/intevation/flys/client/server/CollectionItemAttributeServiceImpl.java Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/java/de/intevation/flys/client/server/CollectionItemAttributeServiceImpl.java Fri Mar 22 11:25:54 2013 +0100
@@ -26,7 +26,6 @@
/**
- *
* @author <a href="mailto:raimund.renkert at intevation.de">Raimund Renkert</a>
*/
public class CollectionItemAttributeServiceImpl
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/webapp/WEB-INF/features.xml
--- a/flys-client/src/main/webapp/WEB-INF/features.xml Wed Mar 06 14:14:15 2013 +0100
+++ b/flys-client/src/main/webapp/WEB-INF/features.xml Fri Mar 22 11:25:54 2013 +0100
@@ -10,6 +10,17 @@
<ftr:feature>river:Rhein</ftr:feature>
<ftr:feature>river:Mosel</ftr:feature>
<ftr:feature>river:Elbe</ftr:feature>
+ <ftr:feature>river:Donau</ftr:feature>
+ <ftr:feature>river:Fulda</ftr:feature>
+ <ftr:feature>river:Havel</ftr:feature>
+ <ftr:feature>river:Lahn</ftr:feature>
+ <ftr:feature>river:Main</ftr:feature>
+ <ftr:feature>river:Neckar</ftr:feature>
+ <ftr:feature>river:Oder</ftr:feature>
+ <ftr:feature>river:Saale</ftr:feature>
+ <ftr:feature>river:Saale-Thüringen</ftr:feature>
+ <ftr:feature>river:Werra</ftr:feature>
+ <ftr:feature>river:Weser</ftr:feature>
</ftr:role>
<ftr:role name="flys_wsa_koblenz">
<ftr:feature>module:winfo</ftr:feature>
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/webapp/images/arrow_down.png
Binary file flys-client/src/main/webapp/images/arrow_down.png has changed
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/webapp/images/arrow_first.png
Binary file flys-client/src/main/webapp/images/arrow_first.png has changed
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/webapp/images/arrow_last.png
Binary file flys-client/src/main/webapp/images/arrow_last.png has changed
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/webapp/images/arrow_up.png
Binary file flys-client/src/main/webapp/images/arrow_up.png has changed
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/webapp/images/format-indent-less.png
Binary file flys-client/src/main/webapp/images/format-indent-less.png has changed
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/webapp/images/go-bottom.png
Binary file flys-client/src/main/webapp/images/go-bottom.png has changed
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/webapp/images/go-down.png
Binary file flys-client/src/main/webapp/images/go-down.png has changed
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/webapp/images/go-first.png
Binary file flys-client/src/main/webapp/images/go-first.png has changed
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/webapp/images/go-up.png
Binary file flys-client/src/main/webapp/images/go-up.png has changed
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/webapp/images/list-add.png
Binary file flys-client/src/main/webapp/images/list-add.png has changed
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/webapp/images/save_at.png
Binary file flys-client/src/main/webapp/images/save_at.png has changed
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/webapp/images/save_csv.png
Binary file flys-client/src/main/webapp/images/save_csv.png has changed
diff -r cfc5540a4eec -r 61bf64b102bc flys-client/src/main/webapp/images/save_wst.png
Binary file flys-client/src/main/webapp/images/save_wst.png has changed