Baselight

CORDIS - EU Funded Projects Under FP1 (1984–1987)

Publications Office of the European Union

@eupublicationsoffice.fp1_cordis


About this Dataset

CORDIS - EU Funded Projects Under FP1 (1984–1987)

This dataset contains projects funded by the European Union under the first framework programme for research and technological development (FP1), which ran from 1984 to 1987.

The file 'FP1 Projects' contains the public grant information for each project: Record Control Number (RCN), project ID (grant agreement number), project acronym, project status, funding programme, topic, project title, project start date, project end date, project objective, project total cost, EC max contribution (commitment), call ID, funding scheme (type of action), coordinator, coordinator country, participants (a semicolon-separated list), and participant countries (a semicolon-separated list in the same order).

The participating organisations are listed in the file 'FP1 Organisations', which includes: project Record Control Number (RCN), project ID, project acronym, organisation role, organisation ID, organisation name, organisation short name, organisation type, participation ended (true/false), EC contribution, and organisation country.

The dataset has been updated to match the structure of more recent datasets, so some fields may not be populated. Reference data (countries, funding schemes/types of action, subjects (SIC codes)) can be found in this dataset: https://data.europa.eu/euodp/en/data/dataset/cordisref-data
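Since the participants and participant countries fields are parallel semicolon-separated lists, pairing each organisation name with its country is a simple zip. A minimal sketch in Python; the function name and the example organisation strings are illustrative, not values taken from the dataset:

```python
def pair_participants(participants: str, countries: str) -> list[tuple[str, str]]:
    """Zip the parallel semicolon-separated participant and country lists."""
    names = [p.strip() for p in participants.split(";") if p.strip()]
    codes = [c.strip() for c in countries.split(";") if c.strip()]
    # The two lists are positionally aligned in the CORDIS export.
    return list(zip(names, codes))

# Hypothetical example values, not taken from the dataset:
pairs = pair_participants("ORG A;ORG B;ORG C", "FR;DE;IT")
# → [('ORG A', 'FR'), ('ORG B', 'DE'), ('ORG C', 'IT')]
```

Stripping empty fragments guards against trailing semicolons, which occur in some CSV exports.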
Publisher name: Publications Office of the European Union
Publisher URL: https://op.europa.eu
Last updated: 2024-11-19T03:37:20Z

Tables

FP1 Projects

@eupublicationsoffice.fp1_cordis.fp1_projects
  • 4.15 MB
  • 3,165 rows
  • 46 columns
CREATE TABLE fp1_projects (
  "rcn" VARCHAR,  -- Record Control Number
  "id" VARCHAR,  -- project ID (grant agreement number)
  "acronym" VARCHAR,
  "status" VARCHAR,
  "programme" VARCHAR,  -- funding programme
  "topics" VARCHAR,
  "title" VARCHAR,
  "start_date" VARCHAR,
  "end_date" VARCHAR,
  "objective" VARCHAR,
  "total_cost" VARCHAR,
  "ec_max_contribution" VARCHAR,  -- commitment
  "call" VARCHAR,  -- call ID
  "funding_scheme" VARCHAR,  -- type of action
  "coordinator" VARCHAR,
  "coordinator_country" VARCHAR,
  "participants" VARCHAR,  -- semicolon-separated list
  "participant_countries" VARCHAR  -- semicolon-separated list
);
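Because 'FP1 Projects' and 'FP1 Organisations' share the project Record Control Number (RCN), the two files can be joined in memory. A sketch assuming each row has already been parsed into a dict keyed by illustrative lowercase field names (the RCN value 1560 and acronym SKIDS appear in the dataset preview; the organisation details are hypothetical):

```python
from collections import defaultdict

def join_on_rcn(projects: list[dict], organisations: list[dict]) -> list[dict]:
    """Attach each project's organisation rows via the shared RCN key."""
    orgs_by_rcn: defaultdict[str, list[dict]] = defaultdict(list)
    for org in organisations:
        orgs_by_rcn[org["rcn"]].append(org)
    # Projects with no matching organisation rows get an empty list.
    return [{**proj, "organisations": orgs_by_rcn.get(proj["rcn"], [])}
            for proj in projects]

# Toy rows; field names and the organisation entry are illustrative.
projects = [{"rcn": "1560", "acronym": "SKIDS"}]
organisations = [{"rcn": "1560", "organisation_name": "EXAMPLE ORG",
                  "role": "coordinator"}]
joined = join_on_rcn(projects, organisations)
```

Building the index once keeps the join linear in the number of rows, which matters little at 3,165 projects but generalises to the larger later framework programmes.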
