

Prime relations with the Commission: 4 central debates


Relations with the Commission are classically separated into scientific and financial relations.
Beyond the classical bilateral exchanges between the Scientific Officer and the coordinator, the scientific relations were, in our case, mediated by a central instrument: annual monitoring by a three-person group. As policy analysts, and with the coordinator a so-called 'expert' in evaluation practices, we strongly questioned this approach compared to other possible choices (see for instance the FP6 2004 monitoring report), and said so explicitly in our annual strategic reports. Our doubts were reinforced by the very limited exchange that took place in the first three rounds of monitoring. This being said, the review reports did push us to adapt our strategy (in particular vis-à-vis research on the ERA) and to improve our reporting system (an important lever in the evolution of the management platform).
Meanwhile, over the four years of monitoring, the same issues kept recurring. They are important to record here because they touch upon the approach to the notion of 'Network of excellence' and apply to any other policy instrument aiming at fostering the structuration of problem-focused communities of practice, in other words transepistemic communities.

1) Which extension? The fascination with new member states. A blind eye to the South.
Our view was that such a network should be built around a core of permanent active members, with a number of associate members joining because of a given thematic interest, for training reasons, etc. This was excluded by the Commission, which required that all participants be on an equal footing, and then spent its life complaining about networks being too large and about the need for smaller ones (including in the review of NoEs, which completely forgot about these historical impediments). The review of team involvement demonstrates that we were right in this approach, with a core of around 15 institutions.
Having some 42 members from the start did not prevent us from applying our approach to the 'space' covered by the network. Our approach was clear-cut: it had to be bottom-up, through the inclusion of teams in selected projects on the basis of the new capabilities they brought. We thus introduced 11 new (associate) members during the life of the NoE.
The view of the Commission differed, however. Its sole focus was enlargement towards the new member states. What struck the coordinator was not this objective but rather the impossibility of having a structured exchange on the issue, even after we demonstrated that it was structural (the absence of existing research collectives and of the organisational or political will to develop them, see page "Fostering the national development of SPRI capabilities in new member states"). This even led to the blunt rejection of an attempt to support an extension towards Africa, where we had demonstrated that there were interesting and active capabilities (while Africa was, on the other hand, politically considered a clear priority for development).

2) Which dissemination policy vis-à-vis stakeholders? Create visible things vs. use existing channels
Debates around dissemination acted as a revealer of the very different views that co-existed about networks of excellence. Were they a new organisational structure destined to replace existing ones? Or were they a 'superstructure' whose role was to enhance the capability of their members to perform better (including, sometimes, by fostering some form of aggregation)? These two positions bear heavily upon the approach towards dissemination and interaction with stakeholders. The first drives towards identifying everything member researchers do with the network, while the second respects their institutional affiliations and puts these in the forefront in terms of credit. We clearly chose the second avenue, in a way accepting that the NoE plays a secondary role in the valorisation of achievements arrived at by participating researchers (this was also in line with the respective financial investments made). We followed a similar approach vis-à-vis academic dissemination, privileging existing channels (in particular the well-established journals, which probably explains the visibility of our small speciality). A third issue arose concerning dissemination vis-à-vis stakeholders, and in particular policymakers. Page 10 of this section explains how we progressively developed an indirect approach to the circulation of the results and new approaches developed. While we consider this 'percolation process' rather successful, it was not very visible and seldom directly associated with the NoE. We did not consider this important, since at the same time PRIME was gaining (or so we thought) strong recognition within the community at large. Does this explain why we were asked by the review panel to reconsider our approach and to create a European AAAS? We cannot know, since exchanges with the review panels were limited (see below). We explained in multiple notes, papers, and on the website why we considered this inadequate, and why we thought it more appropriate to foster the participation of members in the numerous and ever-increasing conferences, workshops and other types of meetings organised or supported by the Commission; still, the issue remained on the agenda for quite a number of years…

3) Logistics and the issue of financial reporting.
The reader need only look at the annual reports to see that we started the second Joint Programme of Activity (JPA) nearly one year late, and that delays were a recurrent feature of this process. By and large, yearly JPAs were accepted at the time they were supposed to finish, increasing year after year the time lag between the agreement on the activities to undertake and the moment we could start them. This was partly due to the time taken by reviewers to submit their reports, but even more to the problems associated with the handling of financial matters. Nothing was clear about audit certificates (who should have them, above which amount, and how often), and nothing was clear about how to fill them in (for three years the required way of filling them in changed with each financial officer, each requiring a different approach from the previous one, and we had more than five in three years…). It went to the point where the coordinator and the professional management body considered asking for an early closure of the contract. A satisfactory solution was found in the end. Still, more than one year after the scientific closure of the contract, we continue to struggle with all the requirements associated with the marginal involvement of a large number of participants, a headache for both parties. In a way, we pay a very classical tribute to innovation studies: small early events have strong consequences on the later trajectory of the potential innovation! Had the Commission accepted two categories of members, most of the financial hurdles we faced would probably have been removed.

4) How to monitor? What can 3 reviewers do? And at what speed? Impacts on project dynamics.
The division in charge of monitoring the NoE in the social sciences and humanities chose a very crude approach to monitoring, sending the reports to a group of three persons and asking them to review them against a set of formal standard questions. We strongly argued against such a process for two main reasons.
The first reason was linked to the interaction between the three reviewers and the network. In the first two reviews, for instance, there were no exchanges, no discussion of their views. We received the review report and that was it! This only changed significantly in the last review, and even then exchanges on the conclusions arrived at by the review panel remained limited. This drove us to produce written replies. Following this trail… is quite interesting for an evaluation specialist but rather unpleasant for those facing these reviews and their effects.
The second reason was associated with the process itself. The FP6 2004 review panel had already addressed the issue. Its report analyses the review practices of the different programmes and concludes that (i) face-to-face hearings, based upon a set of questions raised by the reviewers, have proven to be the most effective learning process, and (ii) this works even better when the reviewers operate as a panel (rather than on a single network), so that they can compare and link things together (keeping in mind not each action individually but the overall sub-programme).
The coordinator still thinks that the yearly approach was inadequate for dealing with a long-term action. The time between two reviews should have been at least 18 months. It would also have been more fruitful to have both the strategic report of the network executive structure and its analysis (with advice about changes and evolutions) by the network scientific council (see page 4). This would have fed a panel overseeing a number of networks, as proposed by the above-mentioned report.
 

