2024 Digital Government Content Survey


Each year, before we open up our Call for Proposals, we survey public servants around the world to understand the policies, technologies, and obstacles that are top of mind for them. After several years, we’re also able to chart how their priorities have shifted.

This survey was conducted using Google Forms between February and April 2024, and shared online using newsletters and social media. Selected topics were cross-tabulated to explore correlations, and some open text responses were summarized using the OpenAI ChatGPT and Google Gemini Large Language Models.


About the respondents

We began by asking respondents some basic demographic information.


Where do you live?

While the vast majority of respondents live in Canada, a decreasing number come from the National Capital Region of Ottawa/Gatineau. 

[Chart: where respondents live]


What level of education have you completed?

The number of respondents with a trade/college background is increasing, up from 10% in 2023 to 15% in 2024.

[Chart: education levels completed]


What vertical or sector are you in?

We had to process responses to this question, because many respondents seemed unclear what a “vertical” meant. For example, we received responses such as “HR and technology” (a role), “State of Florida” (an employer), or “Veterans Affairs” (a department.) As a result, the categories below are somewhat subjective, based on our best understanding of the vertical the respondent intended.

[Chart: verticals and sectors]

As expected given the content of the conference, most respondents work in IT and software.


What sort of organization do you work for?

Only 63.5% of respondents work in the public sector for a federal/national government. This reflects a trend we’re seeing in attendance and subscribers as well: Increased interest in digital government from all levels of government, as well as civil society in general.

[Chart: types of organization]


Do you hold an executive role?

11% of respondents are executives, a smaller percentage than among those who attend FWD50 in person, particularly since the 2023 addition of the Executive Cohort. Whether this reflects executives’ busy schedules or a reluctance to respond publicly is not clear.

[Chart: executive versus non-executive respondents]


What’s your primary skill or role?

This question is an attempt to identify what sort of work each respondent performs. While we provide a default list of roles, the results have to be processed because we include an “other” field. Several responses meant the same thing but were worded differently (for example “Strategy”, “Strategy Design and Management”, and “Strategy, Delivery Design” all represent strategy work.) The categorization here is our best attempt to identify what the respondent intended.

[Chart: primary skills and roles]


How many years have you attended FWD50?

This is a new question for 2024. Conferences can often become “preaching to the choir,” and one of the downsides of a thriving community is in-group jargon and belief systems that may not accurately reflect the state of the world. We analyzed some of the more contentious responses against the number of years respondents had participated to see if there was a relationship between “the choir” and their responses.

Here’s how many years respondents have attended:

[Chart: number of years attended]

Here’s the cumulative view of attendees (how many of the respondents have attended a particular year.)

[Chart: cumulative attendance by year]


Policy priorities

We ask respondents about policy priorities, their interest in various technologies, and what they believe are the major obstacles to delivering effective digital government. Our list of policies hasn’t changed significantly over the years, but we’ve added digital resiliency (2019), tackling disinformation and enforcing laws (2021), and Chat-based AI (2023). This allows us to see how priorities shift, often in response to world events like pandemics, widespread AI adoption, or a loss of trust in government.


Relative policy importance

We ask respondents to rank policies from “unimportant” to “critically important”, and suggest that they be critical and not simply rank everything as important. Nevertheless, these are all vital policies for government, so we compare the difference from the average to understand which policies are, relatively speaking, more or less important.
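The “difference from the average” calculation is straightforward: take each policy’s mean rating, then subtract the grand mean across all policies. Here’s a minimal sketch in Python using invented ratings on a 1–5 scale, not the survey’s actual data:

```python
# Hypothetical ratings on a 1-5 scale ("unimportant" .. "critically important").
# These numbers are illustrative only, not the survey's actual results.
ratings = {
    "digital resiliency": [5, 4, 5, 4],
    "elder care": [3, 2, 3, 3],
    "tackling fake news": [5, 5, 4, 4],
}

# Mean rating per policy, then the grand mean across all policies.
policy_means = {p: sum(r) / len(r) for p, r in ratings.items()}
grand_mean = sum(policy_means.values()) / len(policy_means)

# Positive values are relatively more important than average; negative, less.
relative = {p: round(m - grand_mean, 2) for p, m in policy_means.items()}
```

Because respondents tend to rate everything as important, this relative view surfaces differences that absolute scores would hide.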

[Chart: relative policy importance, 2024]

In 2024, digital resiliency, repairing transportation, tackling fake news, and providing services for the most vulnerable ranked highest; elder care, reducing healthcare costs, and maintaining secure borders ranked lowest.


Comparing policy changes over time

We can see how these priorities have shifted over past years:

[Chart: policy importance over time]

Since those lines are hard to parse, we’ve broken out each policy’s relative importance using a heat chart (darker green represents a higher priority) and sparkline (a small relative graph.)

[Chart: heat chart and sparklines of relative policy importance by year]

Notably, ensuring digital rights had been a consistently high priority that dropped in 2024, while enforcing laws and providing a space for free speech were both consistently low priority but are higher this year. Large Language Models that power AI chatbots were below average in 2023’s survey, but after a year they’re now above average.


Checking the choir

We wondered whether those who’d attended FWD50 in the past were more likely to support certain policies. We analyzed each policy against the number of years a respondent had attended and found little correlation. For example, here’s the analysis of how many years someone has participated and how important they believe digital resilience is as a policy.

[Chart: years attended versus rated importance of digital resilience]


What policies did we miss?

Surveys contain both quantitative and qualitative data. A number, a yes/no answer, or making respondents choose from a pre-defined list yields quantitative data, which is much easier to summarize with graphs, percentiles, and statistics. Unfortunately, by limiting responses to just a set of choices, we’re inevitably “leading the witness.” One respondent spared no words, telling us, “there’s some pretty baked in bias in some of your survey questions. Have you ever considered having them reviewed and made a bit more neutral? Less leading?”

Summarizing open-ended, qualitative feedback is harder, but it’s also where we can find novel insights. In past years, we used word clouds to visualize the most commonly occurring text in open-ended results, which offers an imperfect glimpse into open-ended feedback (and, frankly, makes great visuals for social media.) 

Today, we have new tools that can help. Large Language Models excel at summarization and synthesis, so this year we fed qualitative results into several LLMs including ChatGPT and Google’s Gemini. To identify AI-generated text, we’ve used a different color.


What policy did we miss?

Since we can’t list every policy priority, we asked respondents what we missed. 53% of respondents suggested an addition. Our LLM helpers boiled these down into five commonly recurring themes.

The list of missing policies provided by respondents encompasses a wide range of topics, including affordability, digital governance, cybersecurity, electoral reform, emergency response, immigration, mental health, organizational health, public engagement, reconciliation with indigenous communities, rural policy, service delivery, tax reduction, truth and reconciliation, and youth engagement, among others. These responses highlight the complexity and diversity of issues that citizens consider important for a digital government to address.

Based on the feedback, here are five additional policy options to consider for future surveys:

  1. Affordable Housing and Infrastructure Development
  2. Cybersecurity and Online Safety
  3. Digital Governance and Data Management
  4. Electoral Reform and Democratic Participation
  5. Mental Health Support and Wellness Initiatives

This is a good list! Adding new policies helps us evolve our topics, but has the disadvantage of breaking the year-on-year comparisons that show how technologies, policies, and challenges change over time. It also results in ambiguity: “Affordable housing” can fall under “providing services for the most vulnerable,” and “Cybersecurity and online safety” is part of “digital preparedness & resiliency.”

We’ll bring this feedback to next year’s survey. If you’re proposing a talk for 2024, consider tackling one of these themes.



Technology priorities

The survey now turned to the technology stack. As with policies, we look not only at the relative importance of technologies in 2024, but how this has shifted over the years.


Which technologies are most relevant to government?

We asked, “In the list of technologies below, rate how important you think they will be to digital government and technology transformation.”

We show these results as a heat chart: A grey bar reflects the number of respondents who felt the technology was unimportant, while a red bar shows how many felt it was of the highest importance. This format makes it easier to understand the spread of responses for a particular technology, as well as whether answers are polarized.
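The tallying behind a heat chart like this is simple: count how many respondents chose each rating for each technology. Here’s a minimal Python sketch with invented ratings, not the survey’s actual responses:

```python
from collections import Counter

# Hypothetical 1-5 ratings for two technologies; illustrative data only.
responses = {
    "Blockchain": [1, 1, 2, 1, 3, 2],
    "Open data": [4, 5, 5, 4, 3, 5],
}

# One row per technology: how many respondents chose each rating.
# Rating 1 ("unimportant") maps to the grey end of the scale, 5 to the red end.
heat = {
    tech: [Counter(r)[score] for score in range(1, 6)]
    for tech, r in responses.items()
}
```

Each row of counts then drives the width (or intensity) of the corresponding bars, making polarized distributions easy to spot.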

[Chart: technology importance heat chart, 2024]

Some observations:

  • Blockchain and the Internet of Things scored poorly.
  • Chatbots and conversational UX scored relatively poorly, even though AI and Data Science—the technology that powers those—rated of high importance.
  • Open data and cloud computing both rated relatively highly. Both are mainstream in much of the private sector and in many national governments, which suggests that tech adoption among respondents lags the rest of the world.
  • Assistive devices and accessibility are an important technology that isn’t well represented in much of the content we’ve had on stage.


Comparing technology changes over time

We looked at these technologies over time. As an absolute measure, the importance of technologies has decreased, which may indicate a shift as respondents realize that technology is the tool we use to solve public sector problems, but not the way in which we solve them. There is increased recognition at FWD50 and in literature that tech is not a panacea, and culture, structure, and incentives play a much more important role in successful service delivery.

[Chart: technology importance over time]

Since it’s hard to understand the individual lines, we also split the technology responses into a heat chart and sparklines to better understand individual historical trends. Two technologies that were added to the 2023 survey—fast broadband and no-code tools—don’t have a full history.

[Chart: heat chart and sparklines of technology importance by year]

Technology priorities haven’t changed much over the years, with a couple of exceptions:

  • 5G and broadband is less important, although we only have two years’ data and “two data points is not a trend.”
  • Digital signing and process management has dropped in importance in recent years.
  • Blockchain/digital ledgers and the Internet of Things have always scored relatively poorly, but they have also decreased in importance overall.


Structure, culture, and incentives

The content at FWD50 has shifted from tech-centric in the early years to an emphasis on delivery, implementation, and navigating bureaucracy. In addition to policies and technologies, we look at the mindset and obstacles public servants face when delivering digital services.

This has come under real scrutiny in 2024. While digital promises more services to more people for less money, Canada—which represents the bulk of the FWD50 audience—has fallen from 3rd to 32nd in the world in the past decade according to the UN.


A digital literacy test

One of the most controversial conversations in recent months was about whether government workers need to be digitally literate. We covered this in a blog post, and in a subsequent survey on social media and in our newsletter we heard a relatively resounding yes (although the results varied by platform.)


[Table: social media poll results by platform, % yes versus % no, including X (Twitter)]

We asked the same question in our survey, with the following context:

On September 5, 2023, Berlin CDO Martina Klement announced the city’s “Digital Competence Check” which includes an anonymous self-assessment and knowledge test, based in part on the European Union’s Digital Competence Framework. (link in German)

If Government must be digitally fluent because it traffics primarily in information, and only digital services can scale cost-effectively while meeting the needs of a widely distributed population, should Canada’s Federal Government create a Digital Competence Check that’s enforced similarly to the Second Language Evaluation?

[Chart: responses on a mandatory Digital Competence Check]

71% of respondents felt that a certification was necessary. Those who disagreed felt strongly, worrying that any sort of technical criteria would be ageist or exclusionary.

“I am strongly against a digital literacy test for government employees,” said one respondent, saying that this risks “creating data castes in the public service” that would “have negative outcomes for the more vulnerable in the service.” They predicted that a digital certification requirement might “reduce diversity in thought and approach by limiting hiring to only those with backgrounds in economics, statistics, sociology, mathematics and others similar streams.”

Another respondent said we “can't expect public servants who aren’t given time or money to learn to pass a digital literacy test” and we should instead “invest in upskilling them and change the incentives to upskill first rather than playing police.” (The Berlin example included training time and budget to ensure employees were digitally literate.)


Comparing Executive and non-executive responses

We found that more senior respondents—those in executive positions—were more opposed to digital certifications.

[Chart: executive versus non-executive views on digital certification]

One respondent suggested that “it would be good to get some less digitally focused senior managers as they are one of the biggest blockers to making government digital” because “they make decisions without understanding the consequences.”


Checking the choir

We were curious whether being a supporter of digital certification correlated with the number of years people had attended.

[Chart: years attended versus support for digital certification]

The answer was yes: Those who had never attended were over 15% more likely to oppose a digital certification program.


Why can’t government deliver tech?

Everyone understands that modernizing the public sector is hard. We listed six of the most common obstacles, and asked respondents “what are the main reasons government can’t deliver tech at the pace citizens demand?” The following heat chart displays the results.

[Chart: obstacles to delivering technology at pace]

Outdated processes were the most widely cited cause; access to modern tools and an inability to hire technology workers were the causes respondents felt were least to blame.


Comparing Executive and non-executive responses

We compared how executive and non-executive respondents ranked these obstacles.

[Chart: executive versus non-executive rankings of obstacles]

Executives were much less likely to pin the blame on risk tolerance, modern tools, or an inability to experiment. They also felt that management’s lack of digital smarts was the culprit in failures to deliver.


How would you invest 10% of salaries?

We gave respondents a list of seven ways they could spend 10% of each employee’s salary, asking them to choose one. Options included structural improvements; tool modernization; training; pay increases; severance pay for under-performing workers; better benefits; and hiring more people.

[Chart: how respondents would invest 10% of salaries]

Since some of these suggestions (modern tools, more salary) might address the importance respondents assigned to challenges (lack of modern tools, inability to hire) we analyzed the results to see if participants were consistent, matching their concerns to solutions. The results were mixed:

  • Those who felt a lack of modern tools were to blame voted to spend more money on tools.
  • Those who rated an inability to hire as a major challenge did not support additional salary or benefits—instead, they overwhelmingly endorsed better management and organizational structure.

This suggests that salary and benefits aren’t the problem with recruiting technical public servants; rather, it’s revamping management structure and investing in digital training that will make a difference.
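A consistency check of this kind is a simple cross-tabulation: for respondents who cited a given obstacle, count how many chose the matching remedy. Here’s a minimal Python sketch with hypothetical respondents (the field names and data are invented for illustration):

```python
# Hypothetical respondents: did they blame a lack of modern tools, and
# where would they spend the 10%? Illustrative data only.
respondents = [
    {"blames_tools": True, "spend": "tools"},
    {"blames_tools": True, "spend": "tools"},
    {"blames_tools": True, "spend": "structure"},
    {"blames_tools": False, "spend": "training"},
]

# Among those who blamed tools, what fraction chose to spend on tools?
blamers = [r for r in respondents if r["blames_tools"]]
consistent = sum(r["spend"] == "tools" for r in blamers) / len(blamers)
```

Running the same tally for each obstacle/remedy pair is what reveals whether respondents match their concerns to their proposed solutions.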

Respondents offered other uses for the 10% salary: “a travel budget to encourage in-person team meetings at least once per year,” said one. “My team is distributed across the country and it's tough when the annual team retreat gets cut for budget reasons. I don't know if I will ever meet my colleagues.”

Another said they’d invest it in discretionary spending for employee well-being, “freeing a part of my employees' time for tinkering, self-training, innovating, reading work-relevant content as e-book, paper books, web pages, gamification of learning, and other similar ideas, with the goal of mustering more team spirit, collaboration between team members, and fidelity to the employer.”


What one structural change would you make?

We asked, “what one structural change to government would produce the greatest improvement to service delivery?” We received many different responses, and here’s how an AI summarized them:

Streamlining Processes and Reducing Bureaucracy: Many respondents emphasize the need for simpler rules, reduced red tape, and a decrease in hierarchical structures within government departments. They advocate for decentralization of processes, flattening hierarchies, and pushing decision-making authority down to lower levels to enable faster and more efficient decision-making processes.

Empowering Employees and Teams: Respondents express a desire for greater autonomy and empowerment for employees and teams, emphasizing the importance of giving space for innovation and important work to be done without suffocating bureaucracy and excessive oversight. They advocate for a shift towards remote work by default, allowing technical experts to drive projects, and involving end users in decision-making processes.

Modernizing Technology and Approaches: There is a strong emphasis on the need to embrace modern technologies, tools, and approaches within government processes. Respondents highlight the importance of reducing barriers to adopting new technologies, improving digital literacy, and implementing agile methodologies to drive digital transformation efforts.

Enhancing Service Delivery and User Experience: Many respondents stress the importance of focusing on improving service delivery and user experience. They advocate for multidisciplinary teams, mandatory service standards, and greater collaboration across departments to ensure that government services are user-centered and accessible in a single self-service location.

Promoting Innovation and Risk-Taking: Respondents emphasize the need for a culture of innovation and risk-taking within government. They call for incentives for process improvements, rewarding failures and lessons learned, and creating innovation hubs or labs within government agencies to foster experimentation and creativity.

Overall, these themes underline a strong desire for government organizations to become more agile, user-centric and technology-driven in order to meet the evolving needs of citizens and improve service delivery.



Conclusions

Respondents are split about digital skills: Many believe they’re mandatory across all levels of government, but some worry they’ll divide the public sector and eliminate creativity and humanity. Employees are tired, wanting to invest in well-being and mental health. They want to change the structure and culture of the public service to attract new talent.

Everyone is focused on generative AI’s potential to transform the public service, but worried that it will destroy jobs or create more problems than it solves. Structure and culture outweigh technological solutions, and government adoption of well-understood tools like cloud computing and open source lags that of the private sector.

Respondents were more concerned with “reinforcing” government and the rule of law this year than in past years, across enforcement, fighting disinformation, and making systems less vulnerable.

Taken as a whole, the survey offers good guidance on what kinds of talks, workshops, and interactive sessions will be most useful for our community this year, and what controversies might be good for debates and public discussion.


Published April 18, 2024

