modified the setter method to the default setter
Updated the definition of the mapping between fields and proto elements for the selection criteria implementation
Save interface descriptor data in the aggregation workflow logger entries #4470
Implemented Hadoop Transformation
modified the Hadoop transform workflow
completed the collection workflow
mock version
implemented the Hadoop transformation node as a blackboard node
added parameters to job (apidescriptor, name, path)
replaced by SubmitDnetHadoopJobNode
added a new node for submitting a Hadoop job and getting info via the RabbitMQ client
renamed the module to fix its wrong name
added nodes for reading and writing the version
nodes for aggregation in hadoop
added get info node after collection
Cache evict for direct index api called on the endpoint specified by the new property dnet.openaire.index.api.cacheEvictUrl
merged from trunk -r53774:HEAD
branch for solr 7.5.0
new consistency nodes
override an SQL query
Workflow to test invalid repo profiles
fixed a bug with OpenAIRE IDs
Fixed progressProvider
Added logs
reimplemented method to refresh store sizes
added a condition to reduce the number of updates
Consistency of sizes (mdstores, objectstores, repo profiles and db)
node for bulktagging
call addDataInfo instead of the setter
added WF to transform ScholExplorer links into actionsets
write on dsm_organizations
updated property name
To support incremental updates of projects in the database, reusable functionality of IncrementalTransformationJobNode has been extracted into a superclass. The wf template for entity registries has been updated accordingly, and the wf templates for other types of ds already in incremental mode have been updated for the new param name inherited from the superclass.
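A minimal sketch of the refactoring described above: the incremental-mode parameter handling shared by several workflow nodes is pulled into an abstract superclass, so each concrete node only keeps its own transformation logic. Class names, the parameter name, and the behavior are illustrative assumptions, not the actual D-Net implementation.

```java
// Hypothetical superclass holding the param that all incremental nodes inherit.
abstract class AbstractIncrementalNode {
    // illustrative stand-in for the "new param name inherited from the superclass"
    private String incrementalMode = "FULL";

    public void setIncrementalMode(String mode) { this.incrementalMode = mode; }

    protected boolean isIncremental() {
        return "INCREMENTAL".equalsIgnoreCase(incrementalMode);
    }
}

// Concrete nodes keep only their own logic and reuse the inherited param handling.
class IncrementalTransformationJobNode extends AbstractIncrementalNode {
    String describe() {
        return isIncremental() ? "transform only new/changed records"
                               : "transform the full collection";
    }
}

class ProjectDbUpdateNode extends AbstractIncrementalNode {
    String describe() {
        return isIncremental() ? "update only changed projects in the database"
                               : "reload all projects";
    }
}
```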
moved directIndex API in a new module
Added wf parameter for dashboard visibility
added missing class
upgraded the PostgreSQL JDBC driver version; this brings many changes, because we had to add the @Transactional annotation in some places to avoid 'connection closed' exceptions
merged branch dsm into trunk
#3392: added funder param to FET-H2020 projects
stream claims from the JdbcTemplate instead of loading them in memory
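A sketch of the streaming pattern this change refers to: instead of collecting all claim rows into a list and iterating afterwards, each row is handed to a callback as it is read, so memory use stays constant regardless of result size. In the real code this would be Spring's `JdbcTemplate.query(sql, RowCallbackHandler)`; here the data source is faked with an in-memory iterable so the example is self-contained.

```java
import java.util.function.Consumer;

// Illustrative stand-in for streaming rows out of a JdbcTemplate query:
// the consumer sees one claim at a time; no list of all claims is built.
class ClaimStreamer {
    static void streamClaims(Iterable<String> source, Consumer<String> perRow) {
        for (String row : source) {
            perRow.accept(row); // process a single claim, then forget it
        }
    }
}
```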
fix bug for fet-h2020 context
trivial fixes
use single quote instead of double quote
integrated (hopefully) all required changes from dnet40
Fixed implementation for keeping track of progress
fixed parameter order
Node for writing context updates on hbase and refactoring
ApplyClaims with progress provider
Do not fail on a non-well-formed claim: we go on and keep track of it.
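The policy above can be sketched as follows: each claim is processed inside a try/catch, malformed ones are recorded and reported at the end rather than aborting the whole run. The validation rule and class shape are hypothetical; only the keep-going-and-track-failures behavior mirrors the commit.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: apply claims one by one, never letting a single
// malformed claim fail the whole job.
class ApplyClaimsSketch {
    final List<String> malformed = new ArrayList<>();
    int applied = 0;

    void apply(List<String> claims) {
        for (String claim : claims) {
            try {
                if (!claim.contains("::")) {            // stand-in for real validation
                    throw new IllegalArgumentException("bad claim: " + claim);
                }
                applied++;                              // the real job would write it here
            } catch (IllegalArgumentException e) {
                malformed.add(claim);                   // keep track of it and go on
            }
        }
    }
}
```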
Updated toString
Fixed order of arguments
Fixed semantics and added test
Fixed ApplyClaimRels job param and mapping datasets and pubs to results
Using database service classes to avoid the exception 'Method org.postgresql.jdbc4.Jdbc4Connection.isValid(int) is not yet implemented.'. We are locked to an old JDBC driver (for Hibernate?)
Getters and setters
nodes and workflows for the new claims
Classes and configuration for writing claimed relationships into HBASE
Claims metadata are collected from the file system, not from the database.
migrated to dnet45
codebase used to migrate the production system to Java 8