
Camelot Executive Breakfast Roundup: Smarter Use of Property Valuation Data in a Softening Insurance Market

  • Writer: Camelot Consulting

Date: 9 October 2025
Venue: Searcys at the top of The Gherkin, London
Sponsor: e2Value, Inc.
Moderator: Hélène Stanway
Contributors: Rob Proctor (CUO, Vave), Todd Rissel (CEO, e2Value), Skip Coan (SVP, e2Value)


Overview


As the London Market continues to soften, pricing confidence and portfolio differentiation are becoming defining capabilities for property insurers.


At our latest Camelot Executive Breakfast, hosted at Searcys with panoramic views across the City, we brought together senior leaders from underwriting, pricing, actuarial, and data functions to test one fundamental question:


“Wildfires and floods don’t care about AI, structure valuation, or data; should you?”

Accurate property valuation plays a critical role in insurance: carriers rely on it to set adequate coverage and to manage risk effectively.


If as much as half of commercial structures are unclassified or unknown in key datasets, how do carriers build reliable views of values at risk and price adequacy? Gathering key information on each submission is the only way to produce accurate property valuations and avoid gaps in coverage.


The discussion, chaired under the Chatham House Rule, explored the interplay between valuation data, underwriting discipline, and capital-aware pricing, drawing on a practical case discussion from Vave alongside perspectives from e2Value.


The Core Theme - Property Valuation Data in Insurance


Even with rapid advances in data and analytics, too many property risks are still evaluated on static or COPE inputs, when a detailed, complete set of property data is essential for accurate valuation and pricing. The room agreed on a clear direction of travel: augment traditional underwriting with more dynamic, location- and asset-specific data, and connect that data into an end-to-end pricing model that includes cost of capital, reinsurance costs, claims intelligence, and exposure management, using technology to improve valuation accuracy and keep coverage comprehensive. The ambition is not “data for data’s sake,” but decision-grade inputs that raise confidence at bind and improve portfolio quality over time.


Key Takeaways


Build a 360° pricing model... then automate it.


Pricing should not stop at exposure and historical loss performance. Carriers that integrate cost of capital and reinsurance costs alongside claims and exposure, and feed those insights back into underwriting, enable faster decision-making and a more accurate assessment of the total cost of risk. Integrating insurance claims data also helps inform risk assessment, leading to more accurate claims settlements for policyholders and reducing the risk of underinsurance.
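As a purely illustrative sketch of what such a 360° view might combine (the component names, structure, and example loadings below are assumptions, not figures or methods discussed at the session), a total-cost-of-risk-style technical price could be assembled roughly as follows:

```python
# Illustrative only: component names and example loadings are assumptions,
# not figures from the session or from any carrier's pricing model.

def technical_price(expected_loss: float,
                    expense_ratio: float,
                    reinsurance_cost: float,
                    allocated_capital: float,
                    cost_of_capital_rate: float) -> float:
    """Combine claims, expense, reinsurance, and capital components
    into a single total-cost-of-risk-style technical price."""
    capital_charge = allocated_capital * cost_of_capital_rate
    loadings = reinsurance_cost + capital_charge
    return (expected_loss + loadings) / (1.0 - expense_ratio)

# Example: 120k expected loss, 30% expense ratio, 25k ceded reinsurance
# cost, 400k allocated capital at a 10% hurdle rate.
print(technical_price(120_000, 0.30, 25_000, 400_000, 0.10))
```

The point is less the arithmetic than the principle the room converged on: capital and reinsurance charges sit inside the price, not outside it.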


Granularity plays a critical role in proper coverage... ZIP codes won’t cut it.


ZIP/postcode-level data is often not granular enough to reflect wildfire, flood, and other perils, particularly outside North America, where valuation detail is frequently inconsistent or missing. Asset-level data (location, construction, occupancy, protection, and exposure) is table stakes for modern models. Detailed appraisals and accurate property values are essential for determining proper coverage and replacement costs, ensuring that insureds and insurers have adequate cover for their risk needs.
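As a minimal sketch (the field names and example values are illustrative assumptions, not an agreed market schema), asset-level data of this kind might be captured in a record such as:

```python
# Minimal sketch of an asset-level record; field names are illustrative
# assumptions, not an agreed market data standard.

from dataclasses import dataclass
from typing import Optional

@dataclass
class AssetRecord:
    latitude: float                 # location geocoded to the asset, not the ZIP/postcode
    longitude: float
    construction: str               # e.g. "masonry", "steel frame"
    occupancy: str                  # e.g. "warehouse", "office"
    protection: str                 # e.g. sprinklered, hydrant class
    exposure: str                   # e.g. surrounding hazards, wildland interface
    rebuild_value: Optional[float] = None   # declared or modelled reinstatement cost
    year_built: Optional[int] = None

example = AssetRecord(51.5145, -0.0803, "steel frame", "office",
                      "sprinklered", "urban, low wildfire exposure",
                      rebuild_value=12_500_000, year_built=2004)
```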


Human touchpoints are fragile... protect the data for better accuracy.


Every manual interaction at the point of transaction risks degrading data quality. The push is towards data standards, automation, and consistent delivery so underwriting systems see clean, conformed, and consistent insights. Standardised reports and schedules of values improve compliance with insurance requirements, helping to prevent incorrect valuations and coverage gaps.


Property lags motor on usable data in the insurance industry... yet the stakes are higher.


Motor underwriters have more granular and consistent data than property underwriters do for large and complex property risks, even though the potential loss frequency and severity in property can be far greater. Closing this gap is a risk imperative. Businesses and policyholders benefit from accurate insurance valuations and regular reinstatement cost assessments, which help protect their investment in properties and assets.


AI’s window is open... if your data house is in order.


AI can deliver step-change improvements in underwriting analysis, productivity, and segmentation. But without consistent, timely, and standardised inputs, AI risks making incorrect or misinformed decisions. With accurate valuations and the right technology, insurers and underwriters can use data more effectively for risk management and, in turn, manage their total cost of risk.


Accurate property valuation is dynamic... treat it that way.


Rebuild values fluctuate through the year with supply chain activity, inflation, and loss/catastrophe impacts. Combining COPE data, forensic accounting, and engineering provides a truer view of values at risk, improves vulnerability curves, and produces more accurate, resilient pricing. Material costs, building characteristics, and professional appraisal practice all play a part in ensuring proper valuation and coverage, so that insurance policies reflect current replacement costs.
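As a hedged illustration of that dynamism (the adjustment factors below are assumptions, not published indices or figures from the discussion), a declared rebuild value might be re-indexed during the year along these lines:

```python
# Illustrative only: the index factors are assumptions, not published indices.

def indexed_rebuild_value(base_value: float,
                          construction_inflation: float,
                          demand_surge: float = 0.0) -> float:
    """Adjust a declared rebuild value for construction-cost inflation and,
    where relevant, post-event demand surge."""
    return base_value * (1 + construction_inflation) * (1 + demand_surge)

# Example: a 10m declared rebuild value, 6% construction-cost inflation,
# 15% demand surge after a regional catastrophe.
print(indexed_rebuild_value(10_000_000, 0.06, 0.15))
```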


One carrier can’t standardise the market... leaders can raise the bar.


Market-wide consistency needs treaty underwriters and data vendors aligned on data standards that enable APIs and, in turn, consistent delivery of data. In a bifurcating London Market, lead underwriters have a responsibility to lift data expectations; brokers can then place algorithmic followers with greater certainty. Data standards, sector-specific intelligence, and the relationship between brokers, underwriters, and clients are all key to achieving the right coverage and optimising the total cost of risk.


Case Discussion: Practical Lessons from Vave


Vave led a case discussion on using augmented property data to enhance risk selection and rate adequacy. The key takeaways reinforced the group’s broader conclusions:


  • Start with completeness and consistency. Standardise the variables that underwriting actually uses; don’t swamp underwriters with “nice to have” variables. Ensure a complete set of data and comprehensive reports are available to support accurate underwriting decisions and property appraisals.

  • Engineer features for decisions. Create model-ready fields that map cleanly to pricing, aggregation, and treaty reporting. Leverage technology to enable more accurate and efficient appraisal processes, improving data validation and risk assessment.

  • Close the loop. Feed claims development and reinsurance signals back into class-level and portfolio-level pricing, not just case pricing. Underwriters rely on accurate reports and appraisals to inform pricing and risk management, ensuring dependable and optimised coverage.

  • Reduce manual friction. Push for straight-through, API-first data delivery from vendors and intermediaries; a minimal sketch of what such a delivery check could look like follows this list.
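As a hypothetical illustration of straight-through, API-first delivery (the field names and validation rule below are assumptions, not a description of Vave's or e2Value's actual interfaces), a vendor payload of model-ready fields might be checked on receipt like this:

```python
# Hypothetical illustration of straight-through, API-first delivery of
# model-ready fields; the schema and field names are assumptions, not
# Vave's or e2Value's actual interfaces.

import json

REQUIRED_FIELDS = {"asset_id", "latitude", "longitude", "construction",
                   "occupancy", "rebuild_value", "flood_zone", "wildfire_score"}

def validate_payload(raw: str) -> dict:
    """Parse a vendor payload and reject it before it reaches underwriting
    systems if any model-ready field is missing or null."""
    record = json.loads(raw)
    missing = REQUIRED_FIELDS - {k for k, v in record.items() if v is not None}
    if missing:
        raise ValueError(f"Rejecting payload; incomplete fields: {sorted(missing)}")
    return record

sample = json.dumps({
    "asset_id": "A-1001", "latitude": 51.5145, "longitude": -0.0803,
    "construction": "steel frame", "occupancy": "office",
    "rebuild_value": 12_500_000, "flood_zone": "low", "wildfire_score": 0.12,
})
print(validate_payload(sample)["asset_id"])
```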



The Strategic Edge for Risk Managers


With rate pressure intensifying, carriers need pricing confidence that holds up under scrutiny, from internal governance to treaty negotiations. Accurate property valuation data delivers significant benefits for both insurers and clients, including more accurate risk assessment and improved efficiency. Insurance valuation ensures that coverage levels are based on the true reinstatement or replacement cost of assets, supporting optimal premium calculation and protecting the organisation’s investment by reducing the risk of underinsurance or of overpaying for coverage. Firms that institutionalise dynamic valuation data and connect it directly to pricing, capacity deployment, and reinsurance strategy will:


  • Price faster with fewer handoffs (improved quote ratios and cycle times), as accurate property valuation leads to more accurate premiums and safeguards organisational investment.

  • Segment more effectively, avoiding unpriced accumulation.

  • Earn credibility with treaty markets, aligning on consistent exposure and valuation definitions.

  • Free underwriters to focus on judgement calls, not data collection and manipulation.


Acknowledgements


Our thanks to e2Value for sponsoring the session, Hélène Stanway for moderating an energetic discussion, and to all the executives who participated. We also appreciate Rob Proctor (Vave) for leading the case discussion, and Todd Rissel and Skip Coan from e2Value for sharing perspectives from the vendor side of the market.


Continue the Conversation


