Marcel de Pontbriand is EcoMap’s Lead Data Engineer and Data Scientist. Recognized by Technical.ly as a 2025 RealList Innovator, Marcel has spent nearly five years contributing to the technical foundation that enables EcoMap to deliver ecosystem intelligence at scale.
Before EcoMap, Marcel managed clinical trial data at the University of Maryland School of Medicine and served as an AmeriCorps team supervisor at Civic Works, work that shaped how he thinks about the communities EcoMap serves.
In this conversation, he discusses leading data engineering teams, applying AI to ecosystem challenges, and building systems that help communities coordinate and grow.
- Your team is responsible for EcoMap’s data infrastructure. What does building reliable data systems for ecosystem intelligence require, and how do you approach that work?
-
Ecosystem intelligence is really about understanding what’s happening at various levels of your ecosystem and how different parts interact. That includes the organizations and resources supporting it, the entrepreneurs participating in it, and all the players in between.
To answer those questions well, we need data infrastructure that’s flexible, reliable, and consistent. We build these systems around what I’d call a lowest common denominator of structure and granularity. That approach allows us to work with a new customer with a slightly more novel use case and quickly transform and enrich the data to meet their needs.
Any data point we collect can answer a wide variety of questions. We may not always know exactly what our customers or admins are hoping to find, but keeping things flexible lets us pivot quickly and reshape the data to work with them.
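The lowest-common-denominator idea described above can be sketched as a minimal shared record that gets enriched per customer. Every class, field, and value below is a hypothetical illustration, not EcoMap’s actual schema:

```python
# Hypothetical sketch: store every resource in a minimal, shared
# "lowest-common-denominator" shape, then layer customer-specific
# fields on top when a new use case arrives.
from dataclasses import dataclass, field

@dataclass
class BaseRecord:
    # The fields every ecosystem data source can reliably supply.
    name: str
    category: str
    tags: list = field(default_factory=list)

def enrich_for_customer(record: BaseRecord, extra_fields: dict) -> dict:
    """Build a customer view from the shared core plus extras."""
    enriched = {"name": record.name,
                "category": record.category,
                "tags": list(record.tags)}
    enriched.update(extra_fields)
    return enriched

grant = BaseRecord("Small Business Grant", "funding", ["grants"])
view = enrich_for_customer(grant, {"region": "Baltimore", "stage": "early"})
```

Because the base record stays minimal, the same `grant` can be reshaped for any number of customer views without touching the underlying data.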
- AI is transforming how ecosystem data gets processed and delivered. Where do you see the biggest opportunities for AI to change how economic developers and ecosystem builders do their work?
-
AI is an incredible tool, but if it’s not used well, it can lead to some unexpected outcomes. I think of data as this deep ocean of information. Prior to AI, researchers and analysts could snorkel their way down or put on some scuba gear and go into the data to start understanding what’s happening. But there are these deep, dark depths of data that were very challenging to reach.
AI has increased the ability to dive into that darkness where sunlight isn’t penetrating, so we can see what’s happening at those various levels. Humans are incredible at finding patterns in data and in nature, and AI can accelerate that. It’s an extension of human curiosity that helps pull out patterns quicker and allows people to start acting on them faster. Instead of hunting for signals and patterns, they can chase leads and make decisions.
- Data quality determines whether leaders can make confident decisions. How do you approach ensuring the data EcoMap delivers is something ecosystem builders can rely on?
-
Consistency matters here. Setting a precedent for how our data is represented is helpful, but we also recognize that our customers have different concerns with the data. Some are more focused on the shape of their ecosystem and what it looks like. Others care more about how end users are engaging with it.
Not everyone understands or prefers how we’ve organized the data, whether through keywords or hierarchy. We’ve kept it consistent, but we’ve also made some of our products more flexible when it comes to renaming and relabeling things in their own terms.
With a lot of our newer products, we’re using semantic search and natural language processing tools. That allows us to retain our core ontological structures while keeping things very accessible for people. We meet them wherever they are, especially for the end user.
Entrepreneurs are trying to figure out what to do, and they might not know all the language. Even within ecosystems, the language we use in biotech might not work the same way in agtech or general small-business entrepreneurship. Our systems are flexible when someone searches for something like “project backing.” We can rake through all of our data and tagging to find things like community funding programs or small business grants. The meaning surfaces through questions and inquiries, and we still present the resources or organizations they’re looking for.
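The “project backing” example can be illustrated with a toy query-expansion search. A production system like the one described would use embedding-based semantic search; the hand-written synonym table below is a stand-in for that learned similarity, and every resource name and tag is invented for illustration:

```python
# Toy sketch of query expansion over tagged resources. The synonym
# table stands in for what an embedding model would learn.
SYNONYMS = {
    "backing": {"funding", "grants", "investment"},
    "project": {"business", "venture"},
}

RESOURCES = [
    {"name": "Community Funding Program", "tags": {"funding", "community"}},
    {"name": "Small Business Grants", "tags": {"grants", "business"}},
    {"name": "Networking Meetup", "tags": {"events", "community"}},
]

def search(query: str):
    # Expand each query word into its related tag vocabulary.
    terms = set()
    for word in query.lower().split():
        terms |= SYNONYMS.get(word, {word})
    # Rank resources by tag overlap; keep only actual matches.
    scored = [(len(r["tags"] & terms), r["name"]) for r in RESOURCES]
    return [name for score, name in sorted(scored, reverse=True) if score > 0]

results = search("project backing")
# Both funding-related resources surface; the meetup does not.
```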
- You’ve led EcoMap’s data engineering efforts for nearly five years. What has that evolution looked like, and what can the platform enable now that wasn’t possible when you started?
-
Early on, the role felt a lot like being a data custodian or data plumber. A lot of starting roles feel that way. The data is inconsistent, there are gaps, and you need to make sure it’s getting from point A to point B. Pipelining is important, but our systems were accurate and slow. They had to run one after another. It took about 50 minutes for any one of our original EcoMap products to be synced and updated, and we couldn’t update anything else while that was happening. If we had 20 EcoMaps to update, that was nearly a day’s worth of time running back-to-back.
We’ve grown the sophistication of these tools significantly. We now update nearly 100 EcoMaps nightly. Each one takes just seconds, and it happens at the same time. We’ve decoupled the manual aspect and the time constraints. We can set up these large systems to run in tandem, asynchronously, on their own schedules. That frees us up to spend time on what comes after, like how people are interacting with the data or looking toward new projects.
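The shift from back-to-back syncs to running in tandem can be sketched with Python’s asyncio. The function names and timings below are illustrative placeholders, not EcoMap’s actual pipeline code:

```python
# Sketch of the sequential-to-concurrent shift: independent syncs
# run at the same time instead of one after another.
import asyncio
import time

async def sync_ecomap(name: str) -> str:
    await asyncio.sleep(0.1)  # stand-in for one independent data sync
    return f"{name}: synced"

async def nightly_run(names):
    # All syncs are scheduled together and awaited as a batch.
    return await asyncio.gather(*(sync_ecomap(n) for n in names))

start = time.perf_counter()
results = asyncio.run(nightly_run([f"map-{i}" for i in range(20)]))
elapsed = time.perf_counter() - start
# Twenty concurrent 0.1 s syncs finish in roughly 0.1 s, not 2 s,
# which mirrors the drop from a day of serial runs to seconds each.
```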
We’ve been able to get away from the tedious aspects of connecting data from A to B and allow ourselves to scale rapidly. We now have pipelines and systems that can service multiple products with very little spin-up time. With ecosystem intelligence coming in a new way for products like the State Scorecard, there are some new processes we’ve had to put in place, but it’s become a seamless integration with our existing tools. It’s modular. We can start plugging and playing with new data sources or new products. There’s a bit of spin-up time, but once it’s set, we can move on to refining other tools or pulling insights.
- EcoMap serves small ESOs with lean teams and large state agencies with complex needs. How do you think about building data systems that deliver value across that spectrum?
-
A lot of my thinking ties back to flexibility, reliability, and consistency. Whether you’re a small to mid-sized city that wants to support entrepreneurs or an entire statewide chamber of commerce looking for insights about the state’s ecosystem, we’re pulling from the same data sources. Recognizing what information we need, and how we can shape our data to answer those questions or outline that ecosystem, becomes very manageable.
I work closely with Anna Brinley, our Director of Data and Research. She’s been a great partner in helping translate what a customer is looking for. She’ll say, “They’re trying to understand this kind of ecosystem, and their concern is on the entrepreneur side.” We work together to dial in and tailor the outgoing data or the custom questions they have to better suit those needs.
The core flexible infrastructure allows us to answer these questions. One of our more novel customers is the collectors ecosystem, which includes stamps, coins, wine, and a few others. It’s very unlike the entrepreneurship ecosystem. Entrepreneurs often come to our products because they’re trying to understand what to do. They have a lot to learn. The collectors ecosystem is different. Those folks are steeped in it. They know what’s happening, and they want to connect and act much faster. All of those products are still working off the same data. We can skip steps for customers like collectors because we don’t need to present things in such a granular, step-down way.
- Before data engineering, you worked in community health and served as an AmeriCorps team supervisor. How does that background influence the way you think about the communities EcoMap serves and the problems you’re solving?
-
Working in AmeriCorps here in Baltimore put me in parts of the city I had never really spent much time in before. I met communities and community leaders that weren’t already on my radar. One thing it taught me is that we’re all neighbors, near and far. We’re all working within the same larger institutions, whether that’s the same city, state, or country.
There were a lot of folks working toward the same things. They were looking to better their communities and seeking the same safety, assurances, and support. I think a core value that came up in my interviews with Pava (Pava LaPere, EcoMap’s forever Chief Ecosystem Officer), and something we really connected on, was making this information accessible and understandable.
Our goal as a company is to continue that. We want to aggregate information and make it easily findable and easily understandable, so people aren’t spending their time spinning their wheels. They can get the information and resources they’re looking for and start running with that.
On the community side, it’s also about finding people who are doing that work. It can be easy to reinvent the wheel when you don’t know someone down the road has already invented it. By keeping things visible, there’s a higher degree of connectivity and recognition in what’s happening. People can jump on existing projects or work alongside others who are already trying to solve those same problems.
- You believe that when communities spend locally, hire locally, and align their efforts, everyone’s position gets stronger. How does EcoMap’s data infrastructure support that vision?
-
What we hope for with a lot of our customers is to help them support their ecosystems. Our tools and products allow states and communities all over to have that impact. We want to say, “We do have tools. We know people are interested in starting their own businesses, and we want to support those businesses.”
A lot of people are probably working their primary job, so they have nights and weekends or off hours to do this research and get connected. Our core products facilitate that. They don’t need to be limited by business hours, like only being able to call an office when it’s open and they’re working. They can find that information at their leisure.
The ecosystem intelligence part of this will allow us and our customers to better understand where the gaps are. People have been looking for these types of resources for these types of industries. Are they finding them? Do we even have them available? Gap analysis will become important for all of our customers in the near future, so they can better address the needs of their own communities.
We’re creating tools for end users and entrepreneurs. Then, with our intelligence suite, we’re able to give these larger entities the power to recognize their community. Things have changed a lot. How can we understand what’s happening between the census and other large government analysis tools? Those tools are dense. There’s a lot of data there, and it can be challenging to approach. We’re doing a lot of that legwork in a scalable way and giving both parties access to start making moves instead of being stuck trying to figure out even what questions to ask.
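The gap analysis described above can be sketched as a comparison between what people search for and what the ecosystem actually offers. The queries and categories below are invented for illustration:

```python
# Hypothetical gap-analysis sketch: demand comes from search logs,
# supply from the resources an ecosystem currently lists.
from collections import Counter

searches = ["childcare grants", "biotech funding", "childcare grants",
            "export assistance", "biotech funding", "childcare grants"]
available_categories = {"biotech funding", "small business loans"}

demand = Counter(searches)
gaps = {query: count for query, count in demand.items()
        if query not in available_categories}
# "childcare grants" and "export assistance" are sought but unserved,
# ranked by how often people looked for them.
```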
- You were recognized by Technical.ly as a 2025 RealList Innovator. When you look at where ecosystem intelligence is headed, what excites you most about what’s becoming possible?
-
Being able to dive into the granularity of this data and the realities of these communities excites me. We can explore what might seem like an edge case but might actually be a bigger reality for folks.
Ecosystems are everywhere. They’re invisible, and they’re all shapes and sizes. In some ways, they’re arbitrarily drawn around common themes. Like, a biotech ecosystem within the state of New Mexico, a Native American support ecosystem in the Southwest, or a tech-focused ecosystem in the mid-Atlantic. There are a lot of shared resources and organizations working among all of these various ecosystems.
We’re mapping more and more of all the ecosystems that can exist. With this flexible foundation, new and novel customers might come to us and say, “We really want to understand the confluence of pet health and another industry.” We can start to quickly understand the scope of their ecosystems, understand what they’re trying to ask, and start drawing those lines. Here is the ecosystem we’ve mapped before, but now we’ve organized it in a way that fits what you’re looking for.
We can start pulling that into insights and saying, “You might have all these resources, but when we look at what’s happening on the larger geographic scale, where jobs are going or not coming, or who’s working in those jobs and what the demographics in your area look like, we can start supplying feedback.” We’re not at a point where we’re making policy recommendations, but we can create tools that will allow people who do have the power to make larger decisions on their ecosystems to find those conclusions. They can start making decisions based on what they’ve seen, how it’s changed, and how we might predict it could change.
- Leading technical teams in this space requires balancing innovation with reliability. How do you approach that, and what does technical leadership look like when the work has direct community impact?
-
Flexibility is the biggest part of that. We’re building a lot of our tools from the ground up, but before we start building anything, we do a fair bit of research. What work has been done in these spaces? How have people tried to answer these problems before? We often find things that are close and aligned but haven’t built our novel considerations into the product.
We look at what good patterns are to work off of, what good workflows are, and also recognize what our technical stack allows and what our current product offering allows. How can we build something that satisfies these new questions and concerns but also isn’t going to create additional friction points within our current workflows?
There are a lot of conversations between Anna (our Director of Data), Ed (our CTO), and me about what we have, what we can put on top of that, and then anything new we’re going to bring in-house. How can we ensure that’s something that can also grow and scale and be open to new customers, new questions, and even new data sources? We want to keep things growing with as little friction as possible.
A lot of research and a lot of communication. We don’t want to reinvent the wheel, but if somebody has a really good wheel design or there’s a great wheel we can work off of, that’s a jumping-off point. It allows for speed in development, which is important.
I feel very lucky that over these five years, it’s like I get paid to play with puzzles all day.