updated hadoop-specific aggregation workflows
new context creation for UKRI
Procedure for #6000
#5976: remove from beta the funders already in production
Instructions to address the case of multiple funding paths: see also #5695
Workflows assignable to entityregistry::products, like Datacite
#5695: addressing the cleanup of projects with multiple funding paths
Query updated to ignore funders that are not ready for production, but have been aggregated into production
exclude projects from BETA funders in the view used by the IIS in production
Added missing ';' to properly end statements
fixed table name
SQL commands to copy projects from beta to prod
context for ANR projects
No UNKNOWN funding streams for FP7
logs
Added node to change the last update date of the stats
#5603: Yugoslavian orgs
Fixed counter handling and more logs
Workflow to set the proper date for UI/GUI consumption.
Added OpenAIRE 4.0 compatibility
fixed compilation
log
updated standalone dedup wf, according to the new oozie wf specs
WIP: defining data provision workflow to be run on the OCEAN cluster
WIP: data provision workflow exploiting the oozie jobs on the OCEAN Hadoop cluster
generic OAF compatible aggregation workflow, dedicated to the [Intersection] use case (set DataInfo.invisible = true)
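To make the [Intersection] behaviour concrete, here is a minimal sketch of flagging an aggregated record as invisible; the DataInfo holder below is a simplified stand-in for the real OAF bean, whose field and setter names are assumptions.

    // Sketch: records aggregated for the [Intersection] use case carry
    // DataInfo.invisible = true, so they stay hidden from the public graph
    // until explicitly promoted. The DataInfo class here is a simplified
    // stand-in for the actual OAF schema bean.
    public class InvisibleAggregationSketch {

        static class DataInfo {
            private boolean invisible;
            boolean isInvisible() { return invisible; }
            void setInvisible(boolean invisible) { this.invisible = invisible; }
        }

        static DataInfo invisibleDataInfo() {
            DataInfo di = new DataInfo();
            di.setInvisible(true); // hide the record until the intersection step
            return di;
        }

        public static void main(String[] args) {
            System.out.println("invisible = " + invisibleDataInfo().isInvisible());
        }
    }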
#5297: mappings for the European Environment Agency
Fixing countries for organizations coming from NSF (and not only NSF)
workflow for propagation of results to communities through organization
workflow for orcid propagation
Add country to funders
Query to patch the missing countries for funders.
#4962: patching db for summary and funded amount
Including currency
Assert in test
Tests for the solution to #4993 (new xpath for hostedby)
Fixes #4993#note-8 : patch the "good" hosted by node and ignore the one inside resource, if any.
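As an illustration of the #4993 fix, the sketch below keeps only the record-level hostedby node and ignores any occurrence nested under resource. The element names and the XPath expression are assumptions chosen for the example, not the actual mapping.

    import java.io.ByteArrayInputStream;
    import java.nio.charset.StandardCharsets;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.xpath.XPath;
    import javax.xml.xpath.XPathConstants;
    import javax.xml.xpath.XPathFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Node;

    public class HostedByPatchSketch {
        public static void main(String[] args) throws Exception {
            // Toy record: one hostedby at record level, one nested inside resource.
            String xml = "<record>"
                    + "<hostedby name='Good Datasource'/>"
                    + "<resource><hostedby name='Ignore me'/></resource>"
                    + "</record>";

            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));

            // Select only hostedby nodes that are NOT descendants of a resource element.
            XPath xpath = XPathFactory.newInstance().newXPath();
            Node good = (Node) xpath.evaluate(
                    "//hostedby[not(ancestor::resource)]", doc, XPathConstants.NODE);

            System.out.println("patched hostedby: "
                    + good.getAttributes().getNamedItem("name").getNodeValue());
        }
    }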
query to import OpenOrgs without acronyms
#4650: need specific interpretation for claims
Fixes #4650: claims to enter HBASE before repository record
fixed trailing char
removed old job nodes
deduped orgs to OpenOrgs DB (jobs + wfs) using temporary hdfs files
#4993: sql to manually add entry in hostedby map
openorgs simrels in provision wf
sql queries for open orgs similarities
Updated the xquery for the community configuration used in the exporter. Added information to the community profile to include the organizations that support the community
Workflow for the propagation of results to communities through relevant organizations. The organizations relevant for each community are given as an input parameter (community.map input) managed by the user in the workflow
Workflow for the propagation of results to communities through semantic relations
introduced distcp workflow to synchronise data from DM to IIS cluster
(openorgs) added schemeid to pids in sql query
temporary test
Import of OpenOrgs Organizations
added summary, totalcost and fundedamount to the projects_mv materialized view
merged the Spark transformation job
#4967: change for datasources
merged with branch dnet-hadoop
#4852: sysimport:crosswalk is enough
removed optional1 and optional2 from the project_details table
changed the original name with the English name
fixed wrong number of chars in namespaceprefix
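For context on the namespaceprefix fix, a hedged sketch of a sanity check one could run before inserting a new prefix; it assumes the OpenAIRE convention of fixed-length, 12-character, lowercase alphanumeric/underscore prefixes (e.g. "doajarticles"), which should be verified against the actual column definition.

    public class NamespacePrefixCheck {

        static final int EXPECTED_LENGTH = 12; // assumption: char(12) column

        // Returns true only for prefixes of the expected fixed length and alphabet.
        static boolean isValid(String prefix) {
            return prefix != null
                    && prefix.length() == EXPECTED_LENGTH
                    && prefix.matches("[a-z0-9_]+");
        }

        public static void main(String[] args) {
            System.out.println(isValid("doajarticles")); // true: 12 chars
            System.out.println(isValid("too_short"));    // false: 9 chars
        }
    }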
changed the context to consider the funding stream and the change in the name of the funder
Russian Science Foundation context
fixes #4865: do not export OAI PMH URLs that are not used
Fixes #3191#note-20: incremental transformation
Encode baseUrl like in "normal" MDBuilder
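A hedged sketch of the kind of baseUrl encoding meant here, using the standard java.net.URLEncoder; the real MDBuilder may encode differently, so treat the method as illustrative.

    import java.io.UnsupportedEncodingException;
    import java.net.URLEncoder;

    public class BaseUrlEncodingSketch {

        // Encode the datasource baseUrl so it can be embedded safely in the
        // metadata record, mirroring the "normal" MDBuilder behaviour
        // (assumption: a plain URL-encode of the whole string).
        static String encodeBaseUrl(String baseUrl) {
            try {
                return URLEncoder.encode(baseUrl, "UTF-8");
            } catch (UnsupportedEncodingException e) {
                throw new IllegalStateException("UTF-8 is always supported", e);
            }
        }

        public static void main(String[] args) {
            System.out.println(encodeBaseUrl("http://example.org/oai?verb=ListRecords&set=foo"));
        }
    }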
Cleaned up namespaces and handled objIdentifier like in the dnet50 mdbuilder.xslt.st
Normalize space of metadata identifier path as it is done in "normal" mdBuilder
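To show what normalizing the space of the identifier value amounts to, here is a small Java equivalent of XPath normalize-space(); the real mdBuilder applies it inside the XSLT, so this is only a sketch of the effect.

    public class NormalizeSpaceSketch {

        // Trim leading/trailing whitespace and collapse internal runs of
        // whitespace to a single space, like XPath normalize-space().
        static String normalizeSpace(String value) {
            return value == null ? null : value.trim().replaceAll("\\s+", " ");
        }

        public static void main(String[] args) {
            System.out.println("[" + normalizeSpace("  oai:example.org: 12345 \n") + "]");
            // prints: [oai:example.org: 12345]
        }
    }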
Claims2hbase must always run before oaf and odf to hbase to avoid issues like #4650
Updated to current version in production
#4689: changed type for missionstatementurl on dsm_datasources
fixed name of node in bean
added bean for propagation of community result through semantic relation
commented, in the applicationContext, all the hadoop implementation-specific classes
MERGE dnet-openaireplus-workflows dnet-hadoop 56022:56364 into trunk
change param set to standard void
Text from the wf GUI comes with \\n instead of \n, so we clean the text before posting it
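A minimal sketch of that cleanup: the workflow GUI delivers literal backslash-n sequences, so they are turned back into real newlines before posting; the class and method names are illustrative.

    public class VrePostTextCleaner {

        // The wf GUI escapes newlines as the two characters '\' + 'n'; convert
        // them back to real line breaks before sending the text to the VRE feed.
        static String cleanText(String rawTextFromGui) {
            return rawTextFromGui == null ? null : rawTextFromGui.replace("\\n", "\n");
        }

        public static void main(String[] args) {
            String fromGui = "New dataset available.\\nCheck the VRE catalogue.";
            System.out.println(cleanText(fromGui));
        }
    }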
Validation step also for aggregator of data repository, as requested in #4297
Added CRIS and Guidelines 4 compatibility levels, available for continuous validation
add claims in batch instead of all together at the end
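A hedged sketch of the batching idea: instead of accumulating all claims and writing them once at the end, flush them in fixed-size batches. The batch size and the ClaimSink interface are illustrative assumptions, not the actual claim writer.

    import java.util.ArrayList;
    import java.util.List;

    public class ClaimBatchWriterSketch {

        static final int BATCH_SIZE = 1000; // illustrative; tune to the real workload

        interface ClaimSink {               // stands in for the real claim writer
            void write(List<String> claims);
        }

        // Flush claims every BATCH_SIZE records instead of all together at the
        // end, keeping memory bounded and making partial progress visible earlier.
        static void addClaimsInBatches(Iterable<String> claims, ClaimSink sink) {
            List<String> buffer = new ArrayList<>(BATCH_SIZE);
            for (String claim : claims) {
                buffer.add(claim);
                if (buffer.size() >= BATCH_SIZE) {
                    sink.write(new ArrayList<>(buffer));
                    buffer.clear();
                }
            }
            if (!buffer.isEmpty()) {
                sink.write(buffer); // flush the remainder
            }
        }

        public static void main(String[] args) {
            List<String> claims = new ArrayList<>();
            for (int i = 0; i < 2500; i++) claims.add("claim-" + i);
            addClaimsInBatches(claims, batch -> System.out.println("wrote " + batch.size()));
        }
    }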
Logging
Updates for Selection Criteria implementation
added workflow configuration and hadoop job for orcid publications without doi import
get journal fields from the database
reintegrated branch solr75 -r53774:HEAD
fixed typo querying datasources
Added workflow for posting in the VRE news feed.
Re-implemented POST VRE following instructions at https://wiki.gcube-system.org/gcube/Social_Networking_Service#Write_application_post
distinct PIDs for datasources
merged r55082 from solr75 branch for the workflows for the new CAP and new repository types
Added new case for CRIS compatibility