- Richard Berner – Director, Office of Financial Research, U.S. Department of Treasury
- Benoît Coeuré – Member of the Executive Board, European Central Bank (ECB) & Chairman, Committee on Payments and Market Infrastructures (CPMI)
- Jon Cunliffe – Deputy Governor, Financial Stability, Bank of England
- Verena Ross – Executive Director, European Securities and Markets Authority (ESMA)
- David Wright – Secretary General, International Organization of Securities Commissions (IOSCO)
- Colin Ellis – Chief Credit Officer, EMEA, Moody’s Investors Service
- Stefan M. Gavell – Executive Vice President, Regulatory, Industry and Government Affairs, State Street Corporation
- Larry E. Thompson – Vice Chairman, The Depository Trust & Clearing Corporation (DTCC)
The importance of financial data for supervisors and market participants was stressed
High-quality data are essential for macro-prudential supervision, for assessing and monitoring vulnerabilities in the financial system and for overseeing its functioning. High-quality data are also essential for micro-prudential supervisors as they allow them to understand what is happening in their markets and in individual institutions.
Market data are also critical for helping industry participants to assess, price and manage risks and to react quickly to potentially destabilising events in the market. Trade execution data are also needed to enhance market efficiency and the straight-through processing of trades.
Much progress has been made in the provision of data since the crisis
Prior to the crisis, the public authorities relied mainly on domestic and cross-border banking statistics and did not appropriately take into account the systemic dimension and complexity of the financial system, with the full range of actors and interconnections of which it is composed, i.e. banks, non-banks and other financial companies and markets.
Many international initiatives at the global and regional levels have been launched to improve information and remove the main data gaps. Much progress has been made in the OTC derivatives field in particular, with the setting up in major jurisdictions of trade repositories that are collecting data on a massive scale. These data are still incomplete but are already being used for supervisory purposes, e.g. to monitor potential financial stability events at the domestic level. About 90% of derivative trades are now subject to some kind of reporting. Progress is also being made in the monitoring of Securities Financing Transactions (SFT) such as repo and securities lending. Standardisation is also growing, notably with the ongoing implementation of Legal Entity Identifiers (LEIs). Moreover, projects to improve the pre- and post-trade transparency of non-equity transactions are underway.
Many issues, however, remain to be addressed with regard to the coverage, quality and accessibility of market data
Seven years after the crisis, a credible set of data to mitigate potential systemic risks at the global level has still not been put together, which some panellists described as a collective failure.
Data gaps remain to be closed in many areas: derivatives, shadow banking, credit information and bank funding. The key priorities pointed out by some panellists are to address the data gaps related to OTC derivatives and Securities Financing Transactions (SFT), since these are the main sources of exposures between counterparties and huge sources of potential intra-financial system stress.
The fragmentation of OTC derivatives data across jurisdictions due to the multiplication of trade repositories (TRs) - many countries have developed their own - is a second important issue as it may prevent supervisors from having a clear picture of exposures at the global level and appropriately anticipating future risks. This problem is compounded by the differences in reporting requirements across regions, notably the EU and the US, regarding e.g. OTC derivative transactions or hedge funds, which makes the combination of information across jurisdictions very complex and labour-intensive and creates sources of misinterpretation and overlaps.
One way to limit the consequences of this fragmentation would be to have appropriate data sharing and coordination mechanisms across jurisdictions but these are not sufficient at present. The objective of tackling the legal obstacles to sharing data between jurisdictions was recently emphasized by the G20, but this may be difficult to achieve due to data confidentiality and security issues stressed by domestic supervisors. Trust needs to be reinforced among the public authorities to encourage them to share data more extensively. Stronger coordination among regulators would also facilitate the development of a more efficient infrastructure to support data provision.
Finally, another issue highlighted is the complexity for the public authorities of dealing with the huge volume of data generated by OTC derivatives reporting in particular, which requires specific expertise and resources that central banks and macro-prudential authorities do not usually possess at present.
A wide range of actions can be envisaged to improve market data and its sharing at the international level
Implementing common identifiers for entities, products and transactions and standards for granular data is a key priority and is probably the most tractable improvement objective in the area of data. Work is ongoing in this regard at the international level with the development and implementation of Legal Entity Identifiers (LEI), Unique Product Identifiers (UPI) and Unique Transaction Identifiers (UTI) in particular, but should be accelerated, some believed. Suggestions were also made to rely on existing standards, such as ISO standards, where they exist and can be useful.
Developing processes to ensure an appropriate validation of the quality of the data that is entered into the TRs was also suggested.
Creating a global hub that could facilitate information sharing across jurisdictions in the same way as the BIS for the banking sector and ensure the confidentiality and security of the data was proposed. Such a hub could also potentially concentrate the expertise needed to manage the large amounts of data generated by the reporting. A hub for metadata (i.e. showing what data exist, where they are and what they define) could also be envisaged as well as the development and dissemination of a set of best practices for sharing data at the international level.
Private-public partnerships are another option to consider for managing market data, several panellists suggested, in order to leverage private expertise and resources in the area of "big data". Such partnerships could also help in the setting of standards, either by reusing existing ones or by developing new ones. Private-public collaboration, with the possible contribution of academia, could also supplement the analysis of the data collected.
Some areas where public leadership is essential were however highlighted, such as the launching of projects to consolidate and disseminate information on an industry-wide level (e.g. through databases or websites) or the setting and monitoring of timeframes within an overall roadmap to improve data for a given sector of the industry. There is also a role for the public authorities in the steering of pilot projects related to data collection in order to test their feasibility and added value.
Finally, the panellists all stressed the need for strong and continued political impetus in order to enforce deadlines and ensure the implementation of the solutions needed to solve the current data aggregation and sharing issues. Political momentum is notably important for achieving the implementation of common identifiers and data standards and for bringing public authorities to collaborate more at the international level, some suggested.
1. The importance of market data for supervisors and market participants
The importance of market data was stressed by all the panellists.
An official underlined the importance of improving the coverage, quality and accessibility of financial data and of developing partnerships between policymakers and the industry to provide the appropriate information at minimum cost, without duplication and for mutual benefit. High quality data are equally important for policymakers and market participants. For policymakers, they are essential for assessing and monitoring vulnerabilities in the financial system and for overseeing its functioning. For the industry, they are critical for assessing, pricing and managing risks.
Another official agreed that data are useful for macro-prudential supervision and for several other reasons. Macro-prudential supervisors need to see the "overall picture" and, in a world of globally integrated capital markets and financial sectors, they need to be able to "put together" the picture from the main jurisdictions. This requires standardisation, data sharing and the right coverage. Data are also essential for micro-prudential regulators, who need to understand what is happening in individual institutions and in their markets. They are also useful for the market, for investors and for credit rating agencies, because the market needs to understand institutions.
An industry representative emphasized the importance of market data for industry participants. Whilst having sufficient capital and liquidity is important, the single element that will make the biggest difference over the long term is appropriate data. The private sector needs data because markets tend to react very quickly. Many of the problems experienced in the financial crisis and the Eurozone debt crisis were due to a lack of data and information about bank balance sheets, exposures, etc. Moreover, improving data is also necessary for enhancing market efficiency. Data on the settlement or affirmation of securities trades are needed, for example, to move towards more straight-through processing in markets such as fixed income, which are lagging in this respect compared to e.g. equities.
A market observer added that efforts to uncover new sources of information and improve existing information contribute to improving the quality of the analyses that are needed in the private sector as well as in the public sector. From a user perspective, standardisation, accessibility and availability of data are essential.
2. Much progress has been made in the provision of data since the crisis
An official acknowledged some shortcomings of supervision prior to the financial crisis. In the period from 1990 to the crisis, there was a huge explosion in the complexity and coverage of the financial system. The authorities depended on domestic and cross-border banking statistics and thought that was sufficient to understand what was happening in the financial system. There was "a very rude awakening" when it was discovered that Lehman Brothers had 900,000 derivative contracts and AIG had about $90 billion of repo lending, and the authorities did not know where these were. Then, some links were discovered between credit derivatives in the US and the EU which caused significant problems in the weeks after the collapse of Lehman and the near-collapse of AIG. At that time, the authorities did not pick up the complexity, the range of actors and the range of intra-financial system exposures and vulnerabilities that were building up. They depended on "a previous era of data sharing" through the BIS and banks. Following this, the G20 mandated that data should be collected in order to have a system-wide overview. However, it was insufficient just to mandate the legislation to require this, as was done for OTC derivatives; other things were required to make sure the data were usable. This was left to the implementation stage. The current problems are due to the failure to follow through at that time.
Several panellists detailed the progress that has been made in the collection and provision of data since the financial crisis.
An official stressed that substantial effort and resources have been devoted to obtaining good financial data. Some progress has been made, even if there are still significant gaps and overlaps and neither quality nor accessibility is yet sufficient. Good progress has been made in the derivatives market in particular, with about 90% of derivative trades now being subject to some kind of reporting regime, an industry representative pointed out.
There are already a number of international initiatives to improve information and remove the data gaps, a regulator explained. The Data Gaps Initiative sponsored by the G20 has been very useful. As a member of the Inter-Agency Group on Economic and Financial Statistics, the ECB has been deeply involved in developing and implementing its recommendations and there have been important achievements. New statistics on securities holdings, which provide very detailed insight into which securities are held by whom in the Eurozone, have been introduced. In conjunction with the BIS and the IMF, a handbook on securities statistics has been published, which allows the collection of better securities data. There is also the new ECB regulation on supervisory financial information, which was mandated in the context of the Single Supervisory Mechanism (SSM) and which will gradually extend reporting requirements to entities that have not yet been reporting.
A regulator concurred that much progress has been made. In respect of OTC derivatives, trade repositories in different jurisdictions are collecting data on a massive scale. Even if the data collected are imperfect and incomplete, much progress has been made compared to a few years ago, and these data are actually being used for supervisory purposes, providing a great benefit. The first objective is therefore to use effectively the data that are already available. There are also some important approaches to standardisation, with the Legal Entity Identifier (LEI) in particular representing a huge step forward.
An official agreed that the whole data provision system does not need to be complete for the finished parts to be used, even if the benefits of the whole system cannot yet be obtained. Data from European repositories are currently being used, even though they are still incomplete. They have also been used for research. This is proof of their value, but much more could be concluded from such data.
Much work is also being conducted in the US, focusing on some critical data gaps, in Securities Financing Transactions (SFT) in particular, an official stressed. A guide to US repo and securities lending transactions has just been published by the Office of Financial Research (OFR), pointing out the gaps and where they could be filled. CCPs are another area where much work is taking place, related to the transactions that they clear and settle. Shadow banking remains a huge issue though. The interconnections across the financial system involving shadow banking activities are being mapped out, developing the work that was started five or six years ago by the Fed.
3. Many issues, however, remain to be addressed with regard to the coverage, quality and accessibility of market data
Data gaps and data collection issues
The panellists generally agreed that significant data gaps remain to be closed and that the efficiency and consistency of data collection need to be improved.
A regulator emphasized that the regulatory community has so far failed to pull together a "credible set" of data at the global level and that this is a collective "public good failure". The general public would simply not understand that seven years after a crisis in which 25% of GDP was lost, according to an FSB document, the regulatory authorities still do not have all the data needed to monitor systemic risks.
The areas where data gaps are the most significant were further detailed by several speakers.
There are multiple examples of areas where potential risks have been identified and where data are partial and fragmentary, a regulator stressed.
Derivatives are a first area where there are significant data gaps, a regulator emphasized. Having the ability to monitor the size of derivative markets and the inter-connections between the financial market and market infrastructures is key. In the assessment of risk, data collection can support the regulatory efforts to make financial infrastructure safer and to understand the inter-connectedness within the financial system better. There has been much work within CPMI and IOSCO on stress testing CCPs and this requires the provision of appropriate data.
Credit is a second area where there are significant data gaps, a regulator stated. The availability of complete and granular credit register information, and of the risk parameters that go with it, is critical to understanding the concentration of exposures throughout banking groups. The absence of this information and of relevant databases means that supervisors have to collect additional data on an ad hoc basis to perform their assessments, e.g. stress tests. This additional data collection and the need to ensure the quality of the data place a large burden on banks and also on supervisors. In this respect, some kind of process to provide supervisors with information on a regular basis would reduce costs.
Bank funding is another example of an area with data gaps. A timely and comprehensive overview of bank access to different funding sources in the money markets in Europe and of the conditions under which banks can fund their activities is still missing.
It is also important to get more insight into the shadow banking sector, notably in terms of leverage, exposures and liquidity, the regulator added. This is an important part of the FSB's work from a regulatory standpoint. The sector is continuing to grow and to increase its share of financial intermediation, including lending to the real economy. More comprehensive reporting and more reliable data on shadow banking entities and activities are needed to be able to extend the macro-prudential framework to include shadow banking. The forthcoming EU Regulation on SFT will be very helpful in this respect. With the AIFMD, the Alternative Investment Fund Managers Directive, data are now also starting to be collected on AIFs, a regulator added.
Other areas where significant data gaps remain to be closed were cited by some panellists.
In many other financial areas data collection is only just "coming on stream", a regulator explained. In addition to AIFMD and SFT data, this is the case for settlement data and all the trade data related to MiFID. Regarding MiFID, regulatory authorities in the EU are working on cooperation mechanisms to build the necessary collection processes and systems, and then collectively publish the data when they become available. The aim is to create a market infrastructure that actually supports the market as a whole by providing the transparency of data that is essential. The national competent authorities have delegated a number of functions to ESMA to make that collection as effective as possible, both from a data quality perspective and from an efficiency and ultimately taxpayers' money perspective. Over time this will change the whole picture of trading data in Europe, the regulator claimed.
A market observer emphasized that it is important not to lose sight of the remaining issues raised by some of the data that were available before the crisis, such as RWAs or accounting standards; in some cases it is only when data start to be used that problems appear. With regard to accounting standards, there is more work to be done. There are still differing levels of enforcement and differing levels of quality of accounting. This means that a great deal of time is spent by those using them on trying to clean up and standardise the data. Improvements in this area would be felt throughout the private sector as well. It is important to acknowledge the progress that has been made (e.g. in Pillar III disclosure requirements), but some very simple elements, like common formats for electronic data or global templates, seem to be missing. For example, the FR Y-9C1 is the best bank balance sheet data submission in the speaker's view because, although it is very basic, it is fairly consistent over time and across companies. It would be good to have a similar comparable reporting format on a global basis.
Areas to focus on as a priority
An official emphasized that given the limitations of potential political and regulatory intervention, it is necessary to focus on some key areas. Identifying the main exposures between counterparties which could bring the financial system down is essential, in addition to the actions underway to adjust capital requirements in the banking sector. In this respect, priority should be given to improving data on derivatives, which are a "huge source of intra-financial system stress", and on Securities Financing Transactions (SFT) such as securities lending and repo, for which there should be wider coverage. SFT were initially outside the scope of the initiatives launched following the G20 commitments but have since been picked up by the FSB and the European and US public authorities; other major jurisdictions should also move quickly on this.
A regulator agreed that derivatives are the immediate priority. However, there are a number of other issues which equally suffer from the paucity of the available data and its lack of standardisation. One of the biggest problems is differences in accounting systems, for example between the United States and Europe, in the context of cross-border resolution. Despite all the progress that has been made on resolution, it is not possible to agree on the definition of impaired assets, for example. Having an immediate and clear view of banks' exposures is also essential if one of them gets into trouble. This needs to be addressed with a strong political impetus and some "pretty demanding deadlines".
Inconsistency of data requirements across regions and fragmentation of data
Several speakers stressed the difficulties that differing data provision and reporting requirements across regions create for markets and activities that have a global dimension such as derivatives and hedge funds.
Derivatives are probably the most global and consistent product, an industry representative stated, and it is therefore "disappointing" that there are so many different reporting conventions around the world. Products are very similar across jurisdictions (e.g. interest rate swaps in the US, UK, Japan, etc. are basically the same product) and there are no legacy systems involved. Creating systems that can provide the information in an efficient way under different reporting conventions is a big challenge for the private sector.
Hedge funds are an area with similar reporting inconsistencies, the industry representative believed. The subject was "attacked" on both sides of the Atlantic in slightly different ways, with the AIFMD in Europe and rules established by the SEC in the US. Both initiatives aim at providing more information about hedge funds, but the reporting obligations differ between the EU and the US, leading to different reporting mechanisms, since it has proved impossible to use the same system to handle both sets of requirements (i.e. AIFMD reporting and Form PF).
Another industry representative pointed out the current fragmentation of OTC derivatives data. The goal of the G20 Pittsburgh Summit to bring transparency into the OTC marketplace has not been achieved. In fact, it has been a failure, the speaker believed, and the general public would not understand why the necessary information is still not available when so many resources have been put in place to bring all the data on OTC derivatives together.
A key issue, the speaker stated, is the proliferation of trade repositories. There are 27 trade repositories at present and at least two more are starting next year, in Turkey. This will further fragment the data and make it all the more difficult for public officials to solve the problems, and it might in the end defeat the purpose of trying to bring true insight and transparency into the marketplace. A country will be able to use the data at the local level to evaluate the size of the local market and of domestic exposures and to address market abuse issues – such data are already being used that way – but it will not be possible to have a clear picture of all the exposures at the global level and anticipate where they are going to be moving in a time of crisis. This may cause problems similar to those seen in 2007/2008. It is therefore necessary to better understand why there is such a proliferation of trade repositories and to push the countries that are still in the process of adopting the G20 mandate to use the trade repositories that are already in place, as opposed to building their own.
The DTCC is at present the only truly global repository, the industry representative claimed. It operates across three continents: North America (the US and Canada), Europe and Asia. DTCC repositories report in excess of 1.1 billion messages to regulators around the world each month and hold 42 million open positions. DTCC has advocated that there should be one global standard per asset class for the purposes of systemic risk mitigation, and has been working with CPMI and IOSCO in that regard, submitting data sets that can be used globally.
Responding to a question from the audience about whether adopting single-sided reporting in Europe could help to move towards more international coherence, a regulator stressed that following a "big debate" the political decision was to go for double-sided reporting in EMIR. The rationale is to guarantee that reconciliations are possible, that one can see how the trades actually come together, and to avoid double counting. It however introduces enormous complexity, as the two reports must be reconcilable by having a single transaction identifier and making sure that the counterparty identifiers are right. For securities financing transactions, single-sided reporting would probably have been sufficient. There is always a trade-off between complexity and the absolute security that the right reports ultimately go into the system. There must be a balance of the two but, on the whole, single-sided reporting can work when the right standards actually make sure that the reporting is clear.
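To illustrate why reconcilable double-sided reports depend on a single transaction identifier, the following minimal Python sketch pairs the two counterparties' reports by UTI and flags the fields on which they disagree. The field names are hypothetical and purely for illustration; a real reporting regime such as EMIR defines a much longer list of reportable fields.

```python
from typing import Dict, List, Tuple

# Hypothetical field names for illustration; a real regime such as
# EMIR defines many more reportable fields.
KEY_FIELDS = ["notional", "currency", "maturity_date", "asset_class"]

def pair_and_reconcile(
    reports_a: Dict[str, dict],
    reports_b: Dict[str, dict],
) -> Tuple[List[str], Dict[str, dict]]:
    """Pair the two counterparties' reports of each trade by UTI, then
    list the key fields on which the paired reports disagree."""
    # Reports that only one side submitted cannot be reconciled at all.
    unpaired = [u for u in reports_a if u not in reports_b]
    unpaired += [u for u in reports_b if u not in reports_a]
    # For paired reports, flag every key field on which the sides differ.
    breaks: Dict[str, dict] = {}
    for uti in reports_a.keys() & reports_b.keys():
        diffs = {
            f: (reports_a[uti].get(f), reports_b[uti].get(f))
            for f in KEY_FIELDS
            if reports_a[uti].get(f) != reports_b[uti].get(f)
        }
        if diffs:
            breaks[uti] = diffs
    return unpaired, breaks
```

Without a shared UTI, every trade would surface twice as an "unpaired" report, which is exactly the double-counting and reconciliation problem the panel described.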
Data sharing and coordination among public authorities
Legislation on data provision was mandated by the G20, but at the time not enough emphasis was put on sharing and accessing the data, an official pointed out. The September 2015 G20 Finance Ministers and Governors meeting in Ankara reinforced the objective of tackling the legal obstacles to sharing data between jurisdictions. That will however not be easy to achieve, due to issues of data confidentiality and data security, which are real problems that do not exist just because national legislators are "blinkered". An industry representative agreed that how to protect the data and make sure they are not wrongly used is an important issue to be addressed in this context, but developing more cooperation across jurisdictions is essential.
Data sharing is a major issue, a regulator agreed. Stronger coordination among the authorities involved in collecting data is also needed to facilitate its integration. Today, the underlying concepts, definitions and standards differ from data set to data set, from authority to authority and from jurisdiction to jurisdiction, which makes the combination of data across jurisdictions very labour-intensive and subject to "all kinds of misinterpretations and overlaps". This is not the objective. Political commitment is required from all stakeholders to overcome these obstacles. Different constraints have to be taken into account, including the need for data confidentiality. In addition, authorities (i.e. supervisors, regulators and central banks) have different mandates, but sharing data among all of them is essential.
The regulator reminded the audience of the Tietmeyer Report on international cooperation and coordination in the area of financial market supervision, published in 1999, which led to the creation of the Financial Stability Forum (FSF), later to become the Financial Stability Board (FSB). One of the key areas of focus of this report was the development of data sharing in order to "connect the dots" and understand what is going on in the global financial system. This has not yet been achieved 16 years later. It is necessary to create a "circle of trust" among authorities to encourage them to share data more easily and to overcome the current obstacles, the regulator believed. It can be politically difficult, but this is essential for global financial stability and should be of the utmost priority.
An industry representative confirmed that the coordination between regulators is insufficient. Market participants, especially the large ones that operate globally in many markets, are facing multiple deadlines in the different jurisdictions in which they operate and are not able to work with the different regulators in a coordinated manner. This multiplicity of objectives leads to losing track of the overall structure that needs to be built, and often some parts of the data or of the structure put in place have to be discarded at a later stage. Regulators must be encouraged to try their best to coordinate at least the building of the infrastructures, in order to limit duplication and inefficiencies.
Complexity of dealing with the volume of data and additional expertise needed
Finally, another issue put forward is the complexity for the public authorities of dealing with the huge volume of data generated by reporting, in the OTC derivatives space in particular. The data collected are indeed not easy to manage due to their sheer volume, an official stressed. For example, one European trade repository alone produces a daily flow of 25 million rows of data to process and aggregate. There is plenty of expertise in the world on handling "big data", but central banks and macro-prudential authorities do not have it at present and it is necessary to start building it now.
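As an indication of what handling such volumes involves in practice, here is a minimal sketch, assuming a hypothetical daily trade-state CSV file with illustrative column names, of how a file of that size could be aggregated in manageable chunks rather than loaded into memory at once.

```python
import pandas as pd

def gross_notional_by_pair(path: str) -> pd.Series:
    """Aggregate gross notional by counterparty pair from a large daily
    trade-state CSV, streaming it in chunks instead of loading all
    ~25 million rows into memory at once. Column names are illustrative."""
    cols = ["reporting_lei", "other_lei", "notional"]
    totals = None
    for chunk in pd.read_csv(path, usecols=cols, chunksize=1_000_000):
        # Partial per-pair sums for this chunk of up to one million rows.
        part = chunk.groupby(["reporting_lei", "other_lei"])["notional"].sum()
        # Merge into the running totals, aligning on counterparty pair.
        totals = part if totals is None else totals.add(part, fill_value=0)
    if totals is None:
        return pd.Series(dtype="float64")
    return totals.sort_values(ascending=False)
```

This is only a sketch of the mechanics; the real supervisory task also involves netting, deduplicating double-sided reports and joining data across repositories, which is where the expertise gap described above becomes acute.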
4. Main actions proposed for improving the coverage and quality of market data
A large part of the discussion focused on possible solutions for improving the coverage and quality of market data. Several areas of improvement were suggested, some of which are already being addressed.
Implementing common identifiers and data standards
Standards are a linchpin in the discussion about data quality, an official stated. The panellists agreed that developing identifiers for entities, products and transactions is a key priority and that their implementation should be mandatory. Implementing such standards is possibly the most important part of the regulatory agenda following the G20 commitments, an industry representative emphasized. Standardisation is also a more "tractable" objective than some other data-related issues, an official believed. The issue is that these are technical subjects, which need political pressure to be achieved and to avoid inertia.
Standard identifiers and data standards are essential to ensure the aggregation of data at the regional and global levels. This is particularly important for OTC derivatives, a regulator stressed. Key identifiers include Legal Entity Identifiers (LEI), Unique Product Identifiers (UPI) and Unique Transaction Identifiers (UTI). Much work remains to be done to develop and implement these standards, and global collaboration is ongoing in this respect. The US OFR, the Bank of England and the ECB have jointly sponsored workshops on these issues. A further workshop in October will focus on standards for granular data, such as those reported to trade repositories. It is absolutely critical that the public and the private sectors be jointly involved in this initiative.
The process of implementing LEIs is underway and is "easily achievable", an official claimed. 330,000 entities in 183 countries had LEIs as at the end of 2014, an industry representative pointed out. The FSB, CPMI and IOSCO are also collaborating to develop globally harmonised UPIs and UTIs. Some speakers on the panel however considered that the implementation of these identifiers could be accelerated.
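Part of what makes the LEI tractable is that it is mechanically verifiable: under ISO 17442 it is a 20-character code whose last two characters are check digits computed under the ISO 7064 MOD 97-10 scheme, the same scheme used for IBANs. A minimal Python sketch of the format and check-digit validation:

```python
import re

def is_valid_lei(lei: str) -> bool:
    """Validate an LEI's ISO 17442 format and its ISO 7064 MOD 97-10
    check digits (the same check-digit scheme used for IBANs)."""
    # 18 uppercase alphanumeric characters followed by 2 check digits.
    if not re.fullmatch(r"[A-Z0-9]{18}[0-9]{2}", lei):
        return False
    # Expand letters to two-digit numbers (A=10 ... Z=35); digits map to
    # themselves. int(c, 36) gives exactly this mapping.
    expanded = "".join(str(int(c, 36)) for c in lei)
    # The whole expanded number must be congruent to 1 modulo 97.
    return int(expanded) % 97 == 1
```

Because any party along the reporting chain can run this check locally, malformed entity identifiers can be rejected at source rather than discovered during cross-jurisdiction aggregation.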
The use of these identifiers and standards must be cost-effective, taking into account the implementation costs for the industry, a regulator emphasized. Existing data standards, such as the ISO standards, must be relied on where they exist and can be useful. It is not necessary to build everything from scratch. An industry representative suggested examining some new concepts, such as the FIBO2 standards, to see how they can be used to improve reporting processes.
ESMA is also taking an active part in standardisation efforts and has a clear commitment to make the outcome of the work conducted by CPMI and IOSCO actually happen in practice and to make the necessary changes to the existing reporting regimes.
An industry representative mentioned that there would be some benefit in involving certain organisations engaged in standardisation work, such as the Object Management Group and the Enterprise Data Management Council, in the search for solutions. This could also offer an opportunity to steer the work being conducted by these different bodies.
Ensuring the quality of the data collected
Based on the experience of EMIR reporting, a regulator stressed that data quality and the validation of data are also massive issues. The "big bang" approach chosen for EMIR, starting from nothing and implementing reporting at the same time for OTC and exchange-traded derivatives, was a major challenge. Since then, a great deal of work has been done to improve data quality. This is ongoing and needs to focus further on validation. There have been a number of initiatives in the European derivatives framework to enforce the validation of data, to make sure that the data that actually get into the trade repositories are of the right quality and can be compared and used to report to the public authorities, and also to the market, on a regular basis.
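A minimal sketch of what submission-time validation at a trade repository could look like, rejecting records before they enter the data set. The schema, required fields and asset-class codes below are purely illustrative assumptions, not the actual EMIR validation rules; the is_valid_lei helper is the one sketched earlier.

```python
# Illustrative minimal schema; actual TR validations cover far more fields.
REQUIRED_FIELDS = ["uti", "reporting_lei", "other_lei",
                   "asset_class", "notional", "currency"]
ASSET_CLASSES = {"IR", "CR", "EQ", "FX", "CO"}  # illustrative codes

def validate_report(record: dict) -> list:
    """Return a list of validation errors; an empty list means the
    record can be accepted into the repository."""
    errors = [f"missing field: {f}" for f in REQUIRED_FIELDS
              if not record.get(f)]
    if record.get("asset_class") not in ASSET_CLASSES:
        errors.append("unknown asset class")
    try:
        if float(record.get("notional", 0)) <= 0:
            errors.append("notional must be positive")
    except (TypeError, ValueError):
        errors.append("notional is not numeric")
    # Counterparty identifiers must pass the LEI format/check-digit test.
    for f in ("reporting_lei", "other_lei"):
        if not is_valid_lei(str(record.get(f, ""))):
            errors.append(f"{f} fails the LEI check")
    return errors
```

The design point is that each rejected record produces a specific, machine-readable error the submitter can fix, which is cheaper than reconciling bad data after it has been aggregated.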
An industry representative highlighted the need to focus on some of the practicalities of data collection in order to optimise it. In the EU, there are various ongoing initiatives, such as AIFMD, MiFID II and MiFIR, which all involve data collection. It would be valuable to examine what is already being collected to see whether some of those data could be reused, so that the market does not have to build a wholly new structure.
Creating a global hub to enhance data sharing and international cooperation
Some lessons can be learned from the banking sector, where many similar information sharing problems have been solved, although not perfectly, through the BIS, an official suggested. For OTC derivatives, data sharing may necessitate an "international or global hub" that can ensure the confidentiality and security of the data. This cannot be achieved by a set of different cross-jurisdiction arrangements, the speaker believed. Such an international hub could also help to concentrate the expertise needed to manage the large amount of data generated by reporting, the speaker suggested.
Another suggestion made was to use a hub for metadata (i.e. the data that relate to the data, showing what data exist, where they are and what precisely they define) in order to facilitate the sharing of data.
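To make the idea concrete, here is a minimal sketch of what one entry in such a metadata hub might record; the structure and field names are illustrative assumptions, not a proposal from the panel.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetEntry:
    """One hypothetical metadata-hub entry: it describes what a data
    set contains and where it is held, not the data themselves."""
    name: str            # e.g. "interest rate derivatives trade state"
    custodian: str       # the TR or authority holding the data
    jurisdiction: str    # where the data are collected
    coverage: str        # products and entities in scope
    identifiers: list = field(default_factory=list)  # e.g. ["LEI", "UTI"]
    access: str = "on request, subject to confidentiality rules"
```

A catalogue of such entries would let an authority discover what exists and whom to ask, without the hub itself ever holding confidential trade-level data.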
An official added that developing a set of best practices on data sharing could be considered. The Irving Fisher Committee on Central Bank Statistics has published some for Europe, which could be expanded on a global basis. Having this international perspective is essential. In 2009, in the midst of the crisis, the OTC Derivatives Regulators Forum published a set of criteria for the derivatives market whereby public officials could have access to the data from the jurisdictions and entities that they supervise, but such an approach is insufficient. There must be access to the global picture in order to see what is going on in global financial markets, and best practices must be established for how to do that.
Responding to a question from the audience about the possibility of implementing a single reporting regulation in the EU, rather than a multiplicity of reporting requirements in the different capital markets legislations (i.e. EMIR, MiFID, AIFMD, MAR...), a regulator agreed that such a proposal would be "enormously" welcomed by the private sector. However, after the crisis, a whole set of different legislative proposals were worked on in parallel, with the result that reporting mechanisms are not necessarily completely aligned. Even if there are different underlying legal texts, there is nevertheless a strong learning process in the way these regulations are being implemented, and this will be the case for the reporting requirements of MiFID II following the experience gained with EMIR. In any case, it is not certain whether a single reporting regulation would be feasible given all the different sectors, products and market participants that need to be covered, i.e. banking, OTC derivatives, hedge funds, etc. A single reporting regulation might be a "useful ambition", but it would be a huge challenge to pull it all together.
Developing private-public partnerships to manage the data
Public-private partnerships are a very important concept in an area where "big data" is a key element, a regulator considered. There seems to be a disconnect between the ongoing discussions in the private sector about the huge opportunities and value that collecting, processing and selling big data offers, and the debate in the regulatory sphere about coordination problems and the difficulties of processing and accessing data on a global scale. The danger is moving towards a situation where market data are used and sold by the private sector without their use by the public sector for risk mitigation purposes being appropriately considered. The public sector should therefore be able to leverage private initiatives and incentives in order to support the collection and provision of data to the public authorities and extract the social value from the data, in the same way that the private sector is able to extract the economic value from it.
Several speakers on the panel agreed with the importance of building public-private partnerships in this context. Several areas where such partnerships could be beneficial were cited.
A first area is the development of standards. Many private sector standards, such as the ISO standards, already exist, a regulator stressed. It is not necessary to reinvent the wheel. Rather than starting from scratch in each region, existing global standards from the private sector that are already well accepted should be considered. This is always difficult from a political perspective, because one particular private sector solution should not be advantaged over another, but at the same time there is a need to be pragmatic and to try to use what already exists.
In terms of who sets the standards, a model such as ICANN (the Internet Corporation for Assigned Names and Numbers) seems interesting to examine, an industry representative believed, in order to determine whether it would have any applicability for solving current financial data problems. ICANN was established in the 1990s at the behest of the US government as a public-private partnership, with representation at all levels, to structure internet domain naming, which was very disorganised at the time. Standard setting brought order to the internet and helped spur an information revolution. The ICANN precedent shows that consistent data and an effective collection process can be achieved with a joint process involving the public and private sectors.
A second area is increasing analytical capacities. When it comes to analyzing the data, public-private collaboration is also relevant, the regulator suggested. The public sector will always be constrained in its capabilities and its ability to look at all the data; strong contributions will therefore be needed from the private sector in analyzing the data and in making sure they are properly used. A market observer agreed that the private sector could contribute to the analysis of market data. There are however some areas where public sector leadership will still be required. For example, in the derivatives area, the website that has been put in place showing the consolidated claims of banks across countries would probably have been difficult to set up without strong public sector leadership. An industry representative suggested that academics could also contribute to the analysis of market data and provide insights, and that it would be useful to make sure that data are accessible to them as well.
Need for political impetus
Several panellists stressed the need for a strong political impetus to improve market data and solve data aggregation and sharing issues. This is a "tough political problem" but one that needs to be dealt with urgently, an industry representative emphasized.
The current gaps will not be resolved unless there is "fierce pressure" from the Financial Stability Board and demanding deadlines; if this is not done, the discussion will continue to revolve around the resolution of various privacy, technical and commercial issues, a regulator claimed. Political impetus is needed, with an acceleration of the work underway on LEI, UPI and UTI, and extra pressure on the technical elements, as there is still a poor understanding of how the global financial system works. Solving these issues requires political impetus, ambition, tough deadlines and speed.
A regulator emphasized the importance of raising awareness about the need to close the data gaps and improve data aggregation, as well as of creating the political momentum required to complete the project, given the difficulty of "bringing people together" on such a project.
Continued "political push" is needed to make progress in the many areas needed to improve the quality of data – i.e. common understanding of the main characteristics of what must be reported, harmonisation of the data elements, adjustment of the differences and inconsistencies amongst different systems and methods used for data collection, development and implementation of the key identifiers - otherwise micro and macro-prudential supervisors will not be much better placed to monitor some of the risks in a few years' time.
Public intervention in planning and testing to ensure proof of concept
"Proofs of concept" are essential when tackling data collection, an official believed. In the US repo project for example, lessons were learnt from pilot projects regarding the way data are collected. This has helped to understand precisely what the data represents and how to collect it best in order to avoid adjusting the standards and minimise costs.
Developing an implementation roadmap with short and longer term actions is essential for ensuring the timeliness and cost-effectiveness of the process, and also its consistency, an industry representative emphasized. In doing so, the time needed for the private sector to build the systems required for data collection, and for the public sector to develop the processes for analyzing the data, should be taken into account.
Public officials have been successful in establishing timeframes in the past. This was done successfully in the Eurozone by the ECB, for example, for aligning settlement cycles on T+2. A similar objective was pursued in the US a few years ago by the private sector, but it never succeeded.
A market observer stressed that dealing with the new issues on which data would be needed may be like trying to hit a moving target, because the financial sector is very good at innovating. It is therefore necessary to be realistic about the degree of information that it is feasible to obtain, the timeliness with which it will be provided and the degree to which it remains relevant. In some cases public policy has led developments, for example in getting more OTC derivatives onto central counterparties or with IOSCO's new disclosure rules published this year. In other cases, however, public policy seems to be slightly "behind the curve". This is the case for shadow banking: a great deal of effort went into better understanding this issue, but it may turn out not to be as problematic as first thought.
1 This report, used by the US Federal Reserve, collects basic financial data from domestic bank holding companies (BHCs), savings and loan holding companies (SLHCs) and securities holding companies (SHCs) on a consolidated basis, in the form of a balance sheet, an income statement and detailed supporting schedules, including a schedule of off-balance-sheet items. It is a primary analytical tool used to monitor financial institutions between on-site inspections.
2 The Financial Industry Business Ontology (FIBO) is a business conceptual ontology standard providing a description of the structure and contractual obligations of financial instruments, legal entities, market data and financial processes. Its primary application is data harmonisation and the unambiguous sharing of meaning across data repositories. This common language for the financial industry supports business process automation and facilitates risk analysis. FIBO is a collaborative effort among industry practitioners, semantic technology experts and information architects.