Chapter Seven: CCIF Operations


This chapter examines how well CCIF performed the various administrative tasks associated with the program. It begins with the project selection process, then assesses how, and how well, project results were disseminated to the child care community. It next considers the potential for duplication of effort, then staff monitoring of projects, and finally the role of the Child Care Information Centre.

7.1 Project Selection Process

The program received both solicited and unsolicited proposals. Early in the program, when the study of child care was in its infancy, the majority of proposals were unsolicited. CCIF was very active in promoting itself during the first couple of years: staff went to conferences and consulted with the provinces and existing groups, telling them that CCIF was available and explaining its terms and conditions. The needs came from the communities and CCIF responded. Communities lacked services, and most proposals were easy to fund because most met CCIF's mandate and because there was nothing to compare them with. In the middle years of CCIF, little initiation was needed: people came forward and CCIF helped them put their ideas into a proposal. In the last three years, consultants would solicit proposals because they knew the specific areas in which work needed to be done; they would approach certain organizations on the basis of their knowledge of each organization, its work and its track record.

Virtually all CCIF staff said the same thing about the adequacy of resources: there was enough, or perhaps even too much, money in the early years but not enough in the last years. 'In the first year, $8-13 million were allocated and it was not possible to spend that amount in the first year of a program — time is needed to set priorities. Faced with this, programs either lapse the money or spend it inappropriately.' The program was slow to take off and had the most money in the middle years. A few CCIF consultants claimed that there was too much money in the early years: 'In the early days resources were spent on things that were not needed not knowing any better. Many cadillac projects were funded.' Another person said early 'projects were funded too generously.' But by the last couple of years, after a series of funding cuts, there were not enough funds to address all priorities. For example, research and training projects, both of which require large budgets, were not adequately covered in the last four years.

According to CCIF interviewees, in the last years of the program about 70%-80% of proposals were declined, primarily for lack of funding. The CIS computer system, however, included only 48 rejected cases, which translates into an 8.5% refusal rate. The discrepancy is largely explained by the fact that many project ideas were declined at the conceptual stage, before a serious proposal was written, and so never entered the system.

Besides designating priority areas related to the child care needs of families in unusual circumstances or in under-served areas, CCIF did little or nothing to identify knowledge gaps or service needs and direct funding to those areas. According to CCIF staff, there was no concerted effort to address one priority more than another. CCIF always did a mix of types of projects. 'There were no allocations to priorities — CCIF was reactive.' Activity areas depended on provincial and community needs. For example, funding for research projects accounted for a larger proportion of CCIF activity in Quebec (27%) than elsewhere because of the provincial government's mandate and insistence on research. Ontario was ready to invest in child care, therefore 44% of development projects were done in that province. Also 27% of demonstration projects were conducted by Aboriginals in part because of the need for child care services on reserves.

Formal priorities did not change from the program's inception. Some priority groups, such as children with special needs, Aboriginal people and minorities, received a lot of attention, but there was no systematic attempt to assess these efforts and change direction. Much depended on what proposals were received. Only in the last two years did the program take note of areas that had received little funding, such as rural care and school-aged care, and begin to focus on them.

Different activities were emphasized as the program evolved, however. In the early stages, a lot of needs assessments were done because not much was known about local needs. Starting in 1992 or 1993, the program decided to end funding of needs assessments and feasibility studies.

From the start, the program sought to ensure that demonstration projects had a built-in plan for continued funding after CCIF involvement ended. The program wanted to make sure no 'orphans' were created: projects that would not be funded from another source once CCIF funding ended. Three or four years into the program, it became clear that demonstration projects were getting into trouble with the provinces because they were putting pressure on the provinces to pick up funding after CCIF funding expired. By 1993, CCIF had ended demonstration projects.

The focus shifted more to research, information regarding training (manuals, training tools), facilitating networking and information exchange. Priority was given to policy and program issues — enhanced information, quality resource material, ability to transfer results, and the ability to enhance the work of earlier projects. To some extent, priority was given to underfunded geographic areas, and near the end of the program projects with smaller budgets or co-funded budgets.

All CCIF personnel interviewed believed the program had targeted the right groups. Many also felt that there was more work to do with some of these groups. Because CCIF did not attach priorities to the designated groups, some groups received scant attention — not much headway was made with farm families or minorities, especially immigrants — and others got a disproportionate share of attention — aboriginals and children with special needs.

Moreover, according to CCIF staff, the impact of CCIF was much greater in some provinces than in others (e.g., Ontario had a sophisticated system which CCIF took to a higher level while in the Atlantic region there was nothing and CCIF seeded activities). 'CCIF was a test fund and by design tried to generate ideas for testing/development based on the capacity of the provinces.' The provinces with more capacity got more funding.

There was a wide range in the time it took to approve proposals. Among the case study projects, the shortest approval took two months and the longest two years; on average, 11 months elapsed between submission of the original proposal and project approval. The projects approved most quickly tended to be follow-up projects. The lengthy approval process frustrated some organizations. 'Project staff began to think they would never get an answer.' One project came close to losing the space it had secured and said community interest waned during the lengthy delay. Some respondents mentioned that a proposal process was understood to take a considerable amount of time; these individuals seemed satisfied with how long it took. Still, as one interviewee put it, '18 months seemed somewhat extreme.'

7.2 Information Dissemination

On the whole, projects were responsible for dissemination of their results. Dissemination plans were incorporated in the proposals 'but they were often weak.' For 'some projects it was hard to get them to do more than submit six copies of the final report. Most projects were good at sharing information on a one-to-one basis.'

Some consultants tried to improve dissemination plans at the proposal stage. Alternatively, consultants could amend the budget to allow for broader dissemination of particularly exciting results, or could take on the responsibility for dissemination themselves. This depended on the ability of the organization to do the work itself.

There was disagreement about whether the strategy of putting the onus on the project to diffuse information was appropriate. Some thought it was, since CCIF did not have the money or the resources to disseminate end products; requests for information could go directly to the project. Others felt that dissemination plans should not be built into proposals because it is often hard to tell the merits of the final product at the beginning of the project. 'Therefore it is not necessarily appropriate to budget large amounts for dissemination of the final report.' One person suggested making a publication contribution to worthwhile products so that CCIF could take control and decide what should be disseminated to a larger audience.

Dissemination beyond that done by the projects was usually undertaken by the Information Centre and by the Canadian Child Care Federation. The method varied with the final product: a manual developed by the pediatric association, for example, was in huge demand, whereas the products of small, locally based projects were not. Projects were listed in the catalogue, and the Federation also put out a manual (funded by CCIF) describing the final products by province. The CCIF synopsis of projects was sent across the country to libraries, NGOs, provincial governments and research institutes. Other methods of dissemination included:

  • supporting the creation of parent resource centres which reach out to parents;
  • information services provided by the Information Centre (which existed before CCIF);
  • 'poster sessions' at CCIF conferences at which end products were displayed. (CCIF put together information brochures on the kinds of projects that had been funded — there was much demand for these brochures); and
  • bringing organizations together and enabling small groups of child care professionals to come under one umbrella and respond to requests for information.

For the most part, NGO respondents said that the information received from CCIF was generally good, current, valid, timely, and helpful. It turns out, however, that most were commenting on quick responses to their requests for specific information from CCIF. When it came to dissemination of project results, the story was different. According to one NGO, the only way to get the information was to happen to hear about a project and then request the document or product. 'There is definitely a need for some kind of strategic communications strategy.' One province agreed, complaining that it had been given very little information at all: it got no final reports unless it asked. 'In a lot of cases it was just hit and miss.' This representative speculated that many projects have no final report at all.

Other provinces assessed CCIF's performance on information sharing quite differently. Two provinces were pleased all along with the information they had received from CCIF. 'The CCIF provided us with a number of valuable documents/reports. It has formed an important base of information.' One province said that only recently was there a reliable flow of information regarding certain projects and activities both from within and outside the province. Another thought that CCIF's performance started off well but degraded in more recent years.

No case study project representative knew of any direct dissemination of their products by CCIF, but then none expected this. Project personnel claimed to be satisfied with CCIF's help in disseminating information; those included in CCIF conferences were especially appreciative.

In the survey we asked whether and how CCIF end products were distributed. Table 7.1 displays the responses.

Demonstration Projects

The most popular means of disseminating findings was through workshops or seminars. Conferences were also used by over half the projects. Nearly one quarter claimed that their findings had been published in specialized journals such as Infoparents and Canadian Parents. But few of the respondents specified the journal and some who did named a newsletter or booklet. 'Specialized journal' apparently means different things to different people. About one in eight demonstration projects never disseminated their results (though some of these had just submitted their results for publication). Nearly 23% of respondents did not know whether CCIF disseminated information about their demonstration project. Of those who did, most thought CCIF was very (22%) or somewhat helpful (42%) in disseminating results. But 20% said CCIF was of very little help and 16% said it was no help at all.

Development Projects

As with demonstration projects, the most popular means of disseminating findings was through workshops or seminars. Conferences were a close second. One-third contended that their findings had been published in specialized journals such as Focus Canadian Children; but again, many did not specify the journal and some of the publications specified did not appear to be journals. Only 15% of respondents did not know whether CCIF disseminated information about their development project. Of those who did, most thought CCIF was very (30%) or somewhat helpful (36%) in disseminating results. But 20% said CCIF was of very little help and 15% said it was no help at all.

Research Projects

About one quarter of the research projects were published in specialized journals such as Focus, the Canadian Journal of Health and the Canadian Journal of Research in ECE, boosting potential application of the results, and showing that the quality was high enough to meet publication standards. The percentage of research projects so published may be overestimated, however, since few of the respondents specified the journal and some who did named a newsletter or booklet. Most of the other projects used pamphlets, conferences or workshops to circulate their findings. Nine in ten maintained that the research findings are available to people wishing to obtain them. Among the sources:

Source                       % of Projects
Project sponsor                  68.8
Public library                   15.1
Educational institution          29.0
Federal government library       23.7
Government book stores            4.3
Public book stores                2.2
Child care organization          40.9
Other source#                    14.0
                                (N=93)

# E.g., Youth organization, CCIF, resource centre

There was some negative sentiment about how helpful CCIF was in disseminating project results. Of those who knew about CCIF's role in dissemination (19% said they did not know), 16% asserted that CCIF was not at all helpful, and 16% said CCIF was of very little help. Still, most felt that CCIF was somewhat (39%) or very (30%) helpful in this regard.

Enhanced Information Services

These projects used 1.5 methods on average to distribute their information, most often publishing for general use and workshops/seminars. Only 18% of respondents did not know whether CCIF disseminated information about their information enhancement project. Of those who did, a high proportion thought that CCIF did not do enough: over one-fifth asserted that CCIF was of no help at all in disseminating their information, and a further 14% said it was of very little help. On the other hand, 31% felt CCIF helped a great deal and 34% said it helped somewhat.

Table 7.1 Dissemination of CCIF Project End Products

As further input to the question of dissemination of CCIF products, we asked respondents whether or not they had used products developed by other CCIF projects. Almost two-thirds said yes. Of those who had used other CCIF products, 67% found the products very useful, and 32% found them useful.

7.3 Duplication of Projects

Because project results were usually not widely disseminated, the potential for duplication was heightened. The evidence of duplication is equivocal. Provincial and NGO representatives knew of no instances of duplication: only four stated outright that there was none, while the others thought there probably was some but could cite no examples. Most case study project representatives (7 of the 12) knew of no other CCIF projects in their community; the others maintained that their project did not duplicate any previous project they knew of in the community.

Within CCIF there were two different responses to this issue. One group asserted that there was no duplication of information because the projects had to prove they were producing new information. 'There (may) have been some similar products, . . . but there will be something (in each) that is unique to the region.'

The more common view was that there was probably duplication of information. For one thing, most provincial project reports were not translated so some people could not read them. 'There could have been some duplication of information between different provinces.' For another, there appeared to be no mechanism in place within CCIF to check for duplication. Instead, the program relied on the provinces, 'who could point out if a project had been done before.'

Only one example of duplication of information was given by interviewees: the report produced by the Child Care Federation is somewhat like a status report on child care, which duplicates a document put out in-house. 'However, this doesn't often happen and steps have been taken to rationalize such activities.' Another informant said that feasibility studies got to the point where they were 'reinventing the wheel,' an important reason to stop funding such projects.

More serious evidence of duplication comes from the peer reviewers. They said the Hub Model and the Workplace Day Care projects were concerned with developing child care models that had already been studied and written about. Consequently neither of these projects was judged to have added much to the existing stock of knowledge. We regard this as significant evidence since only 10 projects were subject to peer reviews (and recall that these projects were nominated as excellent by CCIF staff) — i.e., 20% of the projects reviewed were said to have duplicated existing work. This does not mean that 20% of all CCIF projects duplicated previous work, but it raises the possibility that the incidence of duplication was not trivial.

7.4 Monitoring

Following approval of the proposal, the CCIF consultant monitored the project by making on-site visits, by reviewing progress reports and by frequent phone contacts. Projects were also asked to submit a critical path on a quarterly basis by which means any slippage could be identified and sorted out in meetings. At the end of a project, the consultant closed the file after ensuring that the appropriate number of copies of the end product had been received and circulated as planned.

As one interviewee said and several others implied, 'Monitoring was very lax.' This may have been the case for many projects, but not for the ones viewed as consequential by CCIF consultants. Case study projects appear to have been well monitored. Every representative of case study projects asserted that they were satisfied — most said 'very' or 'extremely' satisfied — with the consultation process during CCIF's involvement. They appreciated that CCIF staff were not 'breathing down their necks', but were approachable and helpful when needed. 'Everything was done to accommodate the project's needs.' CCIF staff were said to be accessible, helpful and supportive.

Most case study projects submitted quarterly reports. Contact with the CCIF consultant for some projects was monthly, at least in the early stages of the project. Other projects were in contact on an 'as needed' basis. Reasons for contact ranged from help with preparing reports to invitations to attend workshops. 'CCIF would provide advice for making the program better.' They regularly requested financial information from the projects. CCIF consultants visited most of the sites, usually more than once. Some even visited before the project got under way.

Case study project representatives had a handful of suggestions for improving CCIF's involvement in a project:

  • It would be useful to have a hands-on workshop so that projects could better understand CCIF's reporting requirements.
  • It would be helpful to bring similar projects together early in the process to jointly work out and share solutions to common problems (mentioned twice).
  • CCIF should hold an annual meeting with all CCIF project coordinators so that project information could be shared.
  • During the proposal approval process, a more open process of consultation would have been welcomed — i.e., prior to having the proposal approved it would have been informative to know how and why the project was being approved.
  • CCIF must make sure that it has good people on board.
  • Ensure that the same project officer is maintained for the entire duration of the project.
  • Ensure that CCIF's internal information service is on-line at the start of a project so that information on other projects could be easily and readily available.
  • CCIF's accountability forms could have been much simpler.

7.5 Child Care Information Centre

The Information Centre's role was primarily to disseminate information about child care derived from CCIF final products and from other sources. It received copies of all final reports and produced a synopsis of projects. The centre responded to all ministerial queries and consultants responded to the public. It also responded to requests for final reports either by sending a copy (if supply was adequate) or by referring requests to the appropriate source. Centre staff attended conferences and distributed products. The centre, which existed prior to CCIF, worked in collaboration with CCIF.

The Centre played a role that the Child Care Federation was also playing. It was suggested by some CCIF staff that responsibility for information dissemination could be privatized (possibly through the Federation). Private organizations can identify other sources of information as well and can tie in with universities better than CCIF. The Federation would need on-going funding although some of their activities generate revenues.

There was an undercurrent of disappointment in the Information Centre among some CCIF staff. 'The Information Centre did not play a major role. . . It was not very active in disseminating products.' Another person stated 'it could have had a much larger role.' The Centre created a number of papers on what CCIF supported. 'These are good but late and should have been started earlier. A book listing projects was to be compiled each year but only one was completed.'

7.6 Conclusion

This chapter identified some problems in CCIF's administration. Most importantly, it confirmed a conclusion from the previous chapter that CCIF did little or nothing to identify knowledge gaps or service needs and direct funding to those areas.

CCIF was given high marks for being responsive to requests for information. Beyond this, however, many informants were critical of CCIF's record of proactive dissemination of project results. For example, over a third of project representatives considered that CCIF had been ineffective in this area. Some interviewees also noted that the only way they would get a report was if they just happened to find out about it and asked for it.

Because there has been no systematic analysis of past projects and because distribution of project results is haphazard, the potential for duplication of effort is high. Moreover, CCIF had no systematic procedures for preventing it (e.g., CCIF staff did not make use of the computer system to prevent duplication). Some evidence of duplication was uncovered: most importantly, 20% of peer review projects were considered to duplicate existing work.

On monitoring, again the message seems to be that CCIF was passive. To be sure, CCIF staff were complimented for the help given to projects: they helped mold the proposals, visited project sites and gave helpful advice when asked. But how good staff were at discovering problems without being asked is open to question. We were told that monitoring was lax.

