Regulatory coordination needed further improvement at the global level despite significant progress
Significant progress had been made regarding global cooperation since the crisis, but many issues remained to be addressed.
A regulator stressed that the communiqué from the 2009 Pittsburgh G20 meeting had been a remarkable statement. The spirit of cooperation read through virtually every paragraph, with encouragement to avoid balkanisation and protectionism. However, in the six and a half years since, the spirit of cooperation had been subsiding; regulators, notably in the US, had been operating somewhat unilaterally. A significant step forward had been achieved regarding equivalence determination in February 2016 in the area of CCPs, which was a positive sign. There was also a general agreement among regulators that more international cooperation was necessary. But countries had different histories and legal systems, which made harmonisation difficult. The backdrop of this situation was substandard growth in which the lack of coordination on global rules played a part. Maintaining a strong global financial services market was essential for economic growth.
Two main issues had to be addressed in order to achieve greater cross-border cooperation between Europe, the US and Asia. Political risk was a first issue: regulators were concerned that deference to other jurisdictions would expose them to a greater risk of being challenged by politicians if investors lost money. Loss of sovereignty was a second concern, with global standards often giving rise to defensive postures on the part of domestic regulators.
The issue, however, was that these problems would not go away in a world driven by a greater degree of populism. A cultural shift was needed, and regulators had a role to play in this, to remind the marketplace that investment in securities markets was essential to prosperity but did not come without risks. Market risks should not systematically lead to a political crisis or to investigations. Systemic risks and manipulation were a different issue, and regulators were there to police these risks. Making sure that regulators could continue to work together and finding global ways of addressing market issues were also essential in a context where issues were increasingly being looked at in a similar way around the globe, some panellists emphasised. Further developing a culture of compromise among regulators was also necessary.
When considering which areas of regulation could benefit from more global cooperation, OTC derivatives were an obvious candidate, because of the global nature of the market. Repo markets and the related reporting were another area worth considering.
IOSCO had produced a very detailed paper at the end of 2015 laying out three main approaches to regulatory cooperation (national treatment, recognition and passporting), which would be complemented later in the year by a next-steps paper on cross-border regulation. IOSCO was also increasingly using a toolbox approach in its different initiatives, but this raised potential implementation issues: toolboxes were a set of options but did not establish a standard. Moreover it was not yet clear how equivalence assessments could work with existing vested domestic interests.
Progress had been made with TRs to improve market data
Improving market data was essential since a lack of transparency on the counterparty credit exposures of large swap dealers had been at the heart of the financial crisis. Significant progress had been made towards making the derivatives market more transparent with the implementation of Trade Repositories (TRs).
TRs had emerged as a key tool for identifying and monitoring systemic risk through the collection and maintenance of derivatives data. They were supported by significant investments and were now receiving and reporting massive volumes of data on derivatives markets. Key questions however remained about whether TRs were achieving the G20 mandate for measuring systemic risks. The usefulness of the data was indeed limited by the fragmentation of TRs at the global level, the difficulty of sharing data across jurisdictions, and data standardisation and quality issues including formatting, completeness and accuracy. The implementation of TRs had so far been mostly domestic, with a focus on the surveillance and identification of risks in individual domestic markets - TR data was being successfully used in Europe by the ECB and the ESRB, and also in Singapore and Hong Kong for example - rather than on the monitoring of systemic risk on a global basis, as had originally been planned by the G20.
Capital market data required further harmonisation at the international level
Achieving full consistency of data was a daunting task; an incremental approach had therefore been chosen in order to progressively expand the group of players using standards and a harmonised dataset. Establishing a target for a global data architecture was however necessary, so as not to lose track of the longer-term ambition of more complete harmonisation; swaps data was the number one priority in this regard, some speakers emphasised.
Data standardisation was a win-win improvement for regulators and the financial industry. The importance of data was recognised both by regulators and the industry, notably for systemic risk measurement purposes. There were however huge costs associated with the collection and processing of data, most of which was performed by the industry, in a context of lower margins and higher regulatory costs. Further harmonisation was essential for improving the efficiency of these data-related processes.
It had initially been hoped that standards would first be defined globally and then implemented domestically, but implementation had been faster than expected and domestic jurisdictions had established rules before international standards were available. There was a real need to look at data harmonisation issues globally, in connection with organisations such as CPMI-IOSCO, before implementing some of the detailed requirements at a national level, because data was difficult to improve ex post. Ideally, a single standard-setting authority could be in charge of establishing standards at the global level, monitoring their adoption and the related rulemaking at the domestic level and ensuring compliance with them, a panellist suggested.
Progress was being made towards further standardisation, in particular through international efforts to establish standard identifiers such as the Legal Entity Identifier (LEI). The adoption of LEIs was very encouraging; more than 430,000 LEIs were registered in Europe and some institutions had about 30% of their counterparties already using one. However, the type of data hierarchy to be used with LEIs had to be defined, since an accounting hierarchy did not work for all parts of the industry, a panellist believed. Moreover, 20% of LEIs had already lapsed, and it was a challenge to keep pressure up to renew them each year; this required cooperation between regulators and the industry. A broader adoption of identifiers, such as securities identifiers, should also be planned in a second stage.
The importance of good quality data was emphasised
Sufficiently clean and good quality data was essential for analysing, sharing and using data collectively across jurisdictions.
Improving the quality of data was a main focus of ESMA in particular, both at the domestic and EU levels. Cleaning data so that it could be used effectively was however a difficult and time-consuming task. It was important to draw common lessons from the implementation of EMIR, MiFID and SFTR reporting in this perspective, a speaker suggested. EMIR data was not easy to use, and there was a need to gather more information than had initially been defined, because the understanding of risks had evolved. Expectations had to be well managed in such a context, taking into account the tools that were available. Involving the users and providers of the data in the definition of objectives was also important in order to check the feasibility of data collection and analysis.
Another focus was the measurement of systemic risks. This required ensuring that sufficient data sources were available for assessing the interconnectedness between the banking sector and other parts of the financial sector and for monitoring financial stability in the capital markets, which was increasingly important with the CMU objective to diversify the financing of the EU economy. This was quite challenging because the capital markets had a hugely broad horizon. Trying to get reliable data from across the range of different market participants and activities was quite difficult; the attempts made so far had not been very successful. Supervisors needed to work collectively to assess risks in the overall financial system and how they were moving between its different components. In any case, spotting the next crisis was probably too ambitious, so supervisors had to make sure at least that the last crisis would not repeat itself, a panellist stated.
Data governance and access to data at the international level also had to be improved
The FSB target to remove all barriers to sharing data by the middle of 2017 was an important step towards the further harmonisation of derivatives data reporting. It was suggested that a data governance and access framework was urgently needed. This would ensure that data standards were maintained and updated, as markets and regulatory requirements evolved, whilst providing a formal structure for the appropriate sharing of and access to data across jurisdictions.
Data privacy and transference was an emerging issue to be considered in this regard, with ongoing data protection initiatives in Europe and Japan for example. There was concern about whether data would be handled in an appropriate manner if such rules developed further. Moreover, while some regulatory obstacles to the sharing of data had been removed (for example the indemnification provision in the US), others remained, such as Article 75 of EMIR relative to the recognition of TRs authorised in third-countries.
The Chair introduced the topics of the session. The session would discuss the further regulatory coordination that could be achieved at a global level. Improving global regulation and cross-border implementation of regulation was very important. A great deal of progress had been made in terms of the availability and sharing of data, as well as being able to monitor the data for stability reasons. The focus would be on capital markets, particularly on OTC derivatives.
1. Regulatory coordination at the global level
Progress made with global regulatory coordination
A regulator stressed that the communiqué from the 2009 Pittsburgh meeting of the G20 had been a remarkable statement. The spirit of cooperation read through virtually every paragraph of the agreement, which in the wake of the financial crisis was very dramatic. Reforms had been initiated on a regional and national level to respond to the financial crisis, due to the way regional and national governments were formed around the world. However, the G20 did not want to bring about an end to a global market in financial services and repeatedly through the communiqué regulators had been encouraged to operate with the spirit of cooperation and avoid regional issues, Balkanisation and protectionism.
In the six and a half years since the Pittsburgh meeting, that spirit of cooperation had however been subsiding. The CFTC had given a mea culpa for its own contribution to some of that disharmony, which was a first step in improving that atmosphere. In the wake of the Path Forward Agreement, the CFTC had indeed operated somewhat unilaterally in the imposition of mandates for swaps execution, and Europe had been rightly critical of some of those efforts by the CFTC to impose its own swaps-trading regime on non-US participants in swap markets.
Since then a new Commission had been in place at the CFTC, and it had worked well with the international community to step back from that unilateralism. It had been a long and hard road to reach an equivalence determination in the area of CCPs, but the position reached in February 2016 was an important statement of an approach to those issues in alignment with the Pittsburgh Accords. This was a principle- and outcome-based approach to the implementation of some of the key regulatory mandates that had come out of the financial crisis.
An important backdrop to this conversation was the state of the global economy. Substandard growth had been seen over the eight years since the financial crisis, with sluggish economic growth, sharply diminished global trade, stagnant living standards and growing political uncertainty across countries and regions. These were the direct results of a low-growth world, and the lack of coordination on global rules played a part in that. It was vitally important, in recognising the Pittsburgh Accords, that the globalisation of financial services was not ended, and that a strong global financial services market was maintained rather than a world of fractured liquidity and balkanised markets.
Thinking about coordination of rule sets, it needed to be remembered that it was about more than sets of rules; the key aim was a strong, vibrant global economy.
Remaining issues and questions
In trying to understand why there was not greater cross-border cooperation between Europe, the US and Asia, there were two areas where focus was needed, a regulator stated: the political risk and the loss of sovereignty. Until these two areas were solved, there would never be true global cooperation.
On the political side, the issue was whether a mutual or unilateral recognition regime or some sort of deference to the regulatory regimes of other jurisdictions was possible. Capital market regulators had a mandate that included investor protection, and on the first occasion that something went wrong and an investor was harmed, either through a loss of money or through fraud, the politicians would call in the capital markets regulators and rake them over the coals, asking how they had allowed it to happen. Until this problem was solved, the desired endpoint of deference would never be reached.
The second problem was loss of sovereignty. The issue was that every country and every regulator believed that it had the right approach. The challenge was that when global standard-setters proposed recommendations or principles regarding a particular issue, before individual countries had decided what avenue they were going to go down, there would often be a very defensive posture in these countries against allowing a third party to dictate what the standards should be in their country. Until and unless this problem was solved, the spirit of global cooperation of the 2009 G20 would never be achieved.
The Chair emphasised that although there was wide agreement that more international cooperation around regulation was positive, translating this objective on the ground into the reality of different jurisdictions was not easy and often those two issues of political risk and sovereignty came to the surface. Reality checks were something that regulators tended to forget sometimes, especially in Europe where there was a fragmented history and different legal systems. This was just a fact of life; they tried to move towards a nirvana of global cooperation, but there were hurdles to overcome.
A regulator added that regarding the issue of international cooperation, all countries, big or small, had specificities, which reflected to a large extent national differences in terms of legal issues and the way that society was designed. That made harmonisation incredibly difficult.
Another regulator agreed that political risk and sovereignty were excellent statements of the problem. The issue, however, was that these problems would not go away. If anything, as they entered a world driven by a greater degree of populism, an event of losing money was almost too attractive for politicians not to politicise. A cultural shift was needed, and regulators had a role to play in this, to remind the marketplace that investment, with its attendant market risk, was essential to prosperity. There could not be growth without risk. With risk there would be times when investors lost money, but this could not be a political crisis or the cause of a regulatory investigation every time. If Western economies were ever going to return to growth, regulators needed to remind savers that investment involved risk, and there would be losses from time to time. This was not a crisis; a real crisis was a systemic problem or a market manipulation, and regulators were there to police these risks.
Increasing international cooperation
The Chair mentioned that Dutch MEP Cora van Nieuwenhuizen had outlined four basic principles of global rulemaking and coordination in her contribution to the Eurofi magazine that were worth considering going forward.
The first principle was that global regulators and other jurisdictions needed to have greater involvement in the EU legislative process in order to focus more attention on third-country issues than was currently the case. The second principle was that comprehensive new regulation should be put in place only after international work had been completed and after a clear transposition commitment had been received from other jurisdictions. The delay in the introduction of the European CCP recovery and resolution legislation was a welcome development in this regard. The third principle was that, where global implementation was limited but EU legislation was deemed necessary, individual recognition on the basis of international principles needed to be an option, as in the new Benchmarks regulation. The final principle was that the gold-plating of international standards must be minimised in Europe for example by the use of regulations instead of directives and by pushing for more granular global standards. Gold-plating should especially be opposed by EU institutions that were present during the international negotiations.
A regulator considered that the issues were being looked at in a very similar way around the globe, because ultimately it was the global financial markets that were being discussed. Much of what had been said about the US approach to cross-border regulation was also true of Europe. The Pittsburgh Summit of the G20 had been very important for the whole reform agenda to have high-level principles clearly laid out, as well as the follow up of the FSB; this had allowed putting an overarching framework in place as to what needed to be achieved. The important element was the need to make sure that regulators could continue to engage and find global ways of addressing these issues, before getting into the detailed national implementation. This was something that had not always worked perfectly in terms of timing, and that should be dealt with.
When considering which areas of regulation could benefit from more global cooperation, OTC derivatives were an obvious candidate, because of the genuinely global nature of that market, a regulator suggested. Repo markets and the coordination of the related reporting were another important area to consider.
Another regulator emphasised that IOSCO had produced a very detailed paper at the end of 2015, which had laid out three approaches to the idea of regulatory cooperation (national treatment, recognition and passporting). The CFTC and the EU deserved great credit for taking that spirit of cooperation to a very significant next step. IOSCO would be coming out with a next-steps paper later in the year on cross-border regulation.
A member from the audience noted that the toolbox approach was an important element of global cooperation. When it came to third-country equivalence, one of the things that had been lacking was a mechanism to ensure flexibility but also legal certainty. A question was whether it was now the time for the CFTC, the European Commission and others to consider a standardised toolbox that could be used for equivalence, which gave the flexibility to choose the right tool for the right piece of legislation, but with sufficient legal certainty and transparency during the process.
A regulator answered that there was lately a move within IOSCO towards using toolbox approaches in many of its consultations and final reports. It was a double-edged sword in some respects, because a toolbox was ultimately a set of options; it was not a standard or a requirement, or even guidance. On the issue of equivalence, not as much progress had been made as would have been liked, but that was for a variety of different reasons. It was not clear how far one could go with equivalence assessments. There were many domestic vested interests in not compromising on these sorts of issues but without compromise, it would be difficult to implement a toolkit approach. There was no satisfactory answer for the moment, unfortunately.
2. Data standardisation and sharing
The key role of Trade Repositories (TRs) and related issues
An industry representative stressed that significant progress had been made towards making the global derivatives market safer and more transparent. TRs had emerged as a key tool for systemic risk identification and monitoring, through the collection and maintenance of derivatives data, and for providing transparency into a marketplace that was previously very opaque.
Work however remained to be done. The development of TRs was recent, since the first ones had been implemented three and a half years earlier. Whilst there was much better access to information thanks to TRs, the focus had so far been mainly on the surveillance and identification of risks in individual domestic markets, rather than globally as had originally been planned by the G20. TR data was being used for example by the European Central Bank and the European Systemic Risk Board. In Asia, the Monetary Authority of Singapore was doing some interesting analysis, as was the Hong Kong Monetary Authority on both the FX and rates reports provided. There was therefore some usefulness to it, but it was more difficult than it should have been. In this context, DTCC was the only truly global TR, the speaker claimed. This was not an issue of insufficient investment; hundreds of millions of dollars were being spent every year. Everybody wanted the TRs to work and to avoid a repetition of 2008. This was also one area where people understood what needed to be done. It was, however, frustrating that following the G20 vision and principles, the implementation had then gone in different directions with individual regulation, interpretation and creation.
Although TRs were receiving and reporting massive amounts of data to officials and to the public, key questions still remained regarding the usefulness of the data reported and whether it was achieving the G20 mandate. The FSB had noted the previous year that the majority of its member jurisdictions had trade-reporting obligations, but that the usefulness of reported data was limited by data-quality issues, including formatting, completeness and accuracy. Critical to the discussions about the usefulness of the data collected was also the challenge of coming to a global agreement on how global systemic risk should be measured. The worst thing that could happen would be for the TRs to be viewed as non-essential, impractical or useless in the end, and for the opportunity to be thrown away.
Improving the harmonisation of capital market data was a key issue to be addressed in order to improve its use at the global level.
An industry representative mentioned that a list of 30 basic fields to be aligned globally in order to enable effective systemic-risk oversight had been proposed in the CPMI-IOSCO Data Harmonisation working group. Beyond this, two concepts were essential to improve harmonisation. The first of these was ‘compromise’, for example giving away one data field in exchange for another. The second term was ‘ego’, which should be left out, because nobody had a monopoly on intelligence. Regulators were moving towards adopting the same basic philosophy, i.e. increasing the level of specificity or granularity of applicable standards, but they still needed to ensure that the same standards were being adopted around the world.
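As a concrete illustration of what field-level alignment means in practice, a harmonised record can be reduced to a small set of commonly defined fields with a basic completeness check. The sketch below is hypothetical: the field names are illustrative assumptions, not the actual list of 30 fields proposed in the CPMI-IOSCO working group.

```python
from dataclasses import dataclass, fields

@dataclass
class HarmonisedTradeRecord:
    """Illustrative subset of globally aligned reporting fields.

    The field names here are assumptions for the example, not the
    CPMI-IOSCO list itself.
    """
    uti: str                  # Unique Transaction Identifier
    reporting_lei: str        # LEI of the reporting counterparty
    other_lei: str            # LEI of the other counterparty
    asset_class: str          # e.g. "IR" (rates), "CR" (credit), "FX"
    notional: float
    notional_currency: str    # ISO 4217 code, e.g. "EUR"
    execution_timestamp: str  # ISO 8601, UTC

def missing_fields(record: HarmonisedTradeRecord) -> list[str]:
    """Return the names of fields left empty - a basic completeness check."""
    return [f.name for f in fields(record)
            if getattr(record, f.name) in ("", None)]
```

The point of such a shared schema is that a completeness or format failure would mean the same thing to every regulator receiving the record, regardless of jurisdiction.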
Progress was being made in particular through recent international efforts to establish standard identifiers including the Legal Entity Identifier (LEI). Another industry speaker considered that when the LEI concept had appeared in Dodd-Frank, it had probably been one of the most important long-term parts of the legislation. LEIs were currently being implemented; around 30% of the 70,000 counterparties of State Street for example now had one.
There were however several issues inhibiting the development of LEIs. The first one was the question around data hierarchies, which they had tried to sort out. There had been a consultation and response on that. Adopting an accounting hierarchy did not work for all parts of the financial industry. It worked well in the corporate finance industry, but in other sectors such as investment funds there were difficulties inputting the data. The second issue was that 20% of LEIs had already lapsed, and it was a challenge to keep pressure up to renew them each year. Cooperation between regulators and the industry had to continue to solve this problem.
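The lapse problem concerns registration renewal rather than the identifier itself, which remains mechanically verifiable: an ISO 17442 LEI is 20 alphanumeric characters ending in two check digits computed with the ISO 7064 MOD 97-10 scheme (letters mapped A=10 through Z=35, with the resulting number equal to 1 modulo 97). A minimal validity check might look as follows; it verifies format and checksum only, not registration status.

```python
import re

def lei_is_valid(lei: str) -> bool:
    """Check the format and MOD 97-10 checksum of an ISO 17442 LEI.

    This only verifies that the identifier is well-formed; it says
    nothing about whether the LEI is actually registered or has lapsed.
    """
    lei = lei.strip().upper()
    # 18 alphanumeric characters followed by 2 numeric check digits.
    if not re.fullmatch(r"[A-Z0-9]{18}[0-9]{2}", lei):
        return False
    # Map '0'-'9' to 0-9 and 'A'-'Z' to 10-35, concatenate, take mod 97.
    number = int("".join(str(int(c, 36)) for c in lei))
    return number % 97 == 1
```

Checking whether an LEI is still current would require a lookup against the GLEIF reference data, which is where the renewal effort discussed above comes in.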
The FSB and the LEI groups had chosen an incremental approach, which was the appropriate one, the industry speaker believed. It was important to get as many people as possible involved to begin with, and then to expand the dataset progressively. However, whilst there should be focus on short-term incrementalism, it was important not to forget the long-term and to be ambitious. Securities identifiers were another area that deserved attention once derivatives had been covered. The challenge was defining a global target architecture and at the same time working on a day-to-day basis to solve incremental problems.
A regulator mentioned that it was necessary to recognise that achieving full consistency of data was a very daunting task. There was a need to be modest in terms of aspirations of what could be achieved given the challenges at hand and the specificities of different local markets. Whilst aiming for a long term goal of maximum consistency, they should also be willing to accept gradual progress.
A regulator suggested that existing standards and frameworks could probably be used for standardising data more quickly and efficiently, rather than inventing new ones. The establishment of the LEI had been a real success; they had about 430,000 entries registered in Europe with an LEI, but progress was still needed in other areas.
There were huge costs associated with data collection and processing, an industry representative pointed out, and further standardisation was a win-win improvement for regulators and the industry because it would help take a huge amount of cost out of the system over the next few years. Data was very important for both the industry and regulators; the industry understood that improving data was important for systemic risk measurement and was supportive of that. The issue however was that the industry was providing a large share of the data and bearing a large share of the costs in a context where margins were under pressure in a low interest rate environment, with increased regulatory costs. Improving efficiency was the only solution, and data standardisation was a key element of it.
The hope initially was that international standard setters such as IOSCO would define standards at the global level and that national regulators would then implement them, the industry speaker continued. However with the financial crisis, things had happened more quickly than expected and not in the expected order, with domestic jurisdictions establishing rules in differing ways, particularly for derivatives, before international standards were available. This was understandable, because the data had been needed quickly, but the price was now being paid for this approach.
A regulator agreed that there was a real need to look at data collection and harmonisation issues globally and to do this upfront, before implementing requirements at a national or regional level. Global organisations such as CPMI-IOSCO could do that. Experience had shown that it would indeed be more difficult to re-fix afterwards. Ideally, there should be a single global authority in charge of defining the standards, monitoring their adoption and the related domestic rulemaking and then ensuring compliance, an industry speaker suggested. Such a process had been successfully adopted by the Basel Committee and CPMI-IOSCO for monitoring the implementation of the PFMIs (Principles for Financial Market Infrastructures), and could be extended to global data standards.
Data governance and access
The FSB’s target to remove all barriers to sharing data by the middle of 2017 was an important step towards further harmonisation of derivatives data reporting, an industry speaker believed, as it would bring the issue to the forefront. A framework for data governance and access was urgently needed in this context in order to truly realise the goal of global data harmonisation. This would ensure that data standards were maintained and updated as markets and regulatory requirements evolved, whilst providing a formal structure for the sharing of and access to data across jurisdictions for systemic-risk oversight purposes.
Another industry representative emphasised data privacy and data transfer issues. There was an EU data protection initiative coming up, and other jurisdictions such as Japan had strict rules. There was, rightly, concern in different jurisdictions about the handling of market data and whether data was properly disclosed, but this also raised potential access issues.
A member from the audience noted that much work had been done to improve data harmonisation, but there were still regulatory obstacles to the sharing of data. In the US, Congress had repealed the indemnification provision that had been a barrier to foreign regulators accessing data from the US swap data repositories. In EMIR, however, Article 75 still blocked the sharing of data unless there was an international agreement. The understanding was that an MOU (Memorandum of Understanding), which would be entered into with a European regulator, would not qualify and therefore a Treaty change would be needed which would take many years.
A regulator agreed that tackling the issues related to Article 75 was difficult given that a Treaty change might be needed; this was an issue for the legislators. It was agreed that it was extremely important to make sure that everyone could share data and have access to each other’s TR data.
Data quality and usefulness
Several speakers emphasised the importance of clean and good quality data.
A regulator considered that some good use had already been made of TR data, but it was clearly not as clean or of as high a quality as it should be, which hampered its use, how easily analysis could be done with it, and how easily the data could be shared and used collectively. Improving data quality both at the EU and domestic levels was a focus of ESMA. It would be important to draw common lessons from the implementation of EMIR, MiFID and SFTR reporting requirements (what could be achieved with the data, the level of quality obtained) and build commonality across the corresponding reporting mechanisms in order to use the data as effectively as possible. Reporting was indeed extremely costly for the industry and regulators.
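The formatting and completeness dimensions of data quality lend themselves to simple automated measurement over a batch of reports. The sketch below is illustrative only: the field names and format rules are assumptions chosen for the example, not ESMA's or EMIR's actual validation rules.

```python
import re

# Illustrative per-field format rules (assumptions, not actual EMIR validations).
FORMAT_RULES = {
    "lei": re.compile(r"[A-Z0-9]{18}[0-9]{2}"),          # 20-char LEI shape
    "currency": re.compile(r"[A-Z]{3}"),                  # ISO 4217 code shape
    "timestamp": re.compile(r"\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z"),
}

def quality_report(records: list[dict]) -> dict:
    """Compute per-field completeness and format-pass rates for a batch.

    Completeness: share of records where the field is present and non-empty.
    Format: share of present values matching the (assumed) format rule.
    """
    report = {}
    for field, rule in FORMAT_RULES.items():
        values = [r.get(field) for r in records]
        present = [v for v in values if v]
        report[field] = {
            "completeness": len(present) / len(records) if records else 0.0,
            "format_ok": (sum(bool(rule.fullmatch(v)) for v in present)
                          / len(present) if present else 0.0),
        }
    return report
```

Tracking such rates over time is one way to make the "level of quality obtained" from different reporting regimes comparable, rather than relying on anecdote.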
Another regulator agreed that clean data was essential. Cleaning data however definitely took more time than analysing it and was cumbersome. Another issue was that defining what data was needed for identifying potential systemic risks in a financial stability perspective was very challenging. Analysing banking data over the last 25 years had indeed not been sufficient for supervisors to anticipate the financial crisis.
An industry representative illustrated the importance of clean data with an example. DTCC had created the Trade Information Warehouse, a tool covering primarily credit default swaps that could produce stress scenarios based on simple information. It had been shown to some regulators around the world, who had found it very useful, but it required clean data, meaning that TR data could not be used given the current quality problems.
Another regulator confirmed that EMIR data was still not easy to use. In addition, there was a need to gather additional information in order to understand what was going on, because the understanding of risks was evolving all the time. Despite the extensive ongoing data collection exercise, it was necessary to look, at least occasionally, at things which had not been thought about initially.
The importance of managing expectations and establishing priorities was also highlighted.
A regulator stressed that expectations needed to be managed regarding what could be achieved in terms of data with the tools that were available. It should be made clear that, whilst supervisors would do their best, history had shown that such efforts did not always succeed, even if they were well worth undertaking, and this would happen again.
Another regulator emphasised that, concerning data, it was necessary to go back to first principles and remember that at the heart of the financial crisis was a lack of regulatory transparency into the counterparty credit exposures of large swap dealers to one another. This was why the Dodd-Frank Act had established swap data repositories. There was a great deal of data that everybody would like to have, and regulators would like as much data as possible, but they needed to start with the number one priority, which was swaps data. It was important to keep priorities in mind before working out the secondary and tertiary issues.
A regulator added that involving the users and providers of data early on was an important point, in order to ensure that financial and systemic risk data could be collected and appropriately analysed. Looking at the feasibility of collecting the data and anticipating what could be done with it was extremely important.
The regulator also emphasised the importance of monitoring the interconnectedness between the banking sector and other parts of the financial sector. Moreover, with the CMU's objective of shifting part of the financing from banks to capital markets, there was an increasing need to monitor financial stability in the capital markets. Doing this was quite challenging, however, because capital markets were extremely broad in scope, and obtaining reliable data from across the range of different market participants and activities was incredibly difficult. Steps had been taken in that direction, with the AIFMD and MiFID reporting requirements in particular.
Better understanding how risk moved between the different parts of the market, and the interconnectedness between different parts of the financial sector, was not an easy task. Even banking supervisors, with a much better and more established data set than was available on the capital and securities markets front, were still struggling with this. The only way progress could be made was for the different supervisors to assess risks in the overall financial system together. Assessing interconnectedness also involved bringing in some representatives of the buy side and of the sell side. Baby steps were being taken towards improving the monitoring of risks in different parts of the financial system; these efforts needed to be combined, which would take time.
An industry representative considered that spotting the next crisis was difficult: by its nature, it would be something unexpected. What was needed, however, was to avoid repeating the last crisis. Unless and until this was got right, there would be another AIG that nobody had seen building up, leaving everyone wondering how they had missed it. Everyone was aligned on this, and the cooperation needed between the industry, the repositories and the regulators would be a big step in the right direction.
1 A speaker explained, for example, that DTCC had 5,500 clients in more than 30 countries, supported 3,000 data elements and processed 300 million messages a week. Assuming that each of those messages carried 100 data points, this amounted to 30 billion data points a week, over 100 billion data points a month, or at least a trillion data points a year.
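The orders of magnitude quoted in the footnote can be verified with simple arithmetic. The sketch below takes the reported 300 million messages a week and the speaker's assumed 100 data points per message (both figures are the speaker's, not independently verified):

```python
# Scale check for the reported trade repository data volumes.
messages_per_week = 300_000_000   # 300 million messages a week, as reported
points_per_message = 100          # speaker's assumption

points_per_week = messages_per_week * points_per_message   # 30 billion
points_per_month = points_per_week * 52 // 12              # average month
points_per_year = points_per_week * 52

print(f"per week:  {points_per_week:,}")    # 30,000,000,000
print(f"per month: {points_per_month:,}")   # 130,000,000,000
print(f"per year:  {points_per_year:,}")    # 1,560,000,000,000
```

The weekly figure matches the footnote exactly; the monthly and yearly figures (roughly 130 billion and 1.56 trillion) are consistent with the footnote's "over 100 billion a month" and "at least a trillion a year".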