Category: Development & Packaging (page 2 of 15)

All articles in this category are related to engineering teams in the Fedora Project, in particular teams working on packaging and release engineering. https://fedoraproject.org/wiki/Development

F35 retrospective results

After the release of Fedora Linux 35, I conducted a retrospective survey. I wanted to see how contributors felt about the release process and identify areas where we can improve. There was no particular reason to start with F35 other than that’s when I got around to doing it. So how did F35 go? Let’s look at the results from the 63 responses.

Continue reading

Looking at Fedora Linux 33 bugs

At Nest, I delivered a talk called “Exploring Our Bugs”. But a single snapshot isn’t very useful. Building on the work there, let’s make this a regular thing. With the recent Fedora Linux 33 end-of-life, I’ve added F33 bugs to the bug exploration notebook. Here are a few of my key findings.

Trends

After a drop in bug reports in F32, F33 had about as many bug reports as F31. This holds for both duplicate and non-duplicate bug reports.

Duplicate bug reports by release

Bug reports coming from abrt recovered to roughly the historical average after a surprisingly low F32.

Sources (abrt or non-abrt) of bug reports by release

We fixed roughly the same number of F33 bugs as in the last few releases. But with the increase in overall bugs, that means we left more bugs unfixed this time around. The dramatic increase in bugs closed EOL reflects this.

Bug reports by closure type each release
Percentage of bug reports closed end-of-life by release

The good news is that we are getting faster at fixing the bug reports that we do fix.

Mean and median time to “happy” bug report resolution by release.
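
If you are curious how numbers like these are produced, a minimal pandas sketch of the calculation looks something like the following. The file name and the "opened", "closed", and "version" columns are illustrative assumptions, not necessarily what the actual notebook uses.

import pandas as pd

# Load a hypothetical Bugzilla export; "opened" and "closed" are assumed
# timestamp columns and "version" is the Fedora Linux release.
bugs = pd.read_csv("bugs.csv", parse_dates=["opened", "closed"])

# Only reports that were actually resolved count toward time to resolution.
resolved = bugs.dropna(subset=["closed"]).copy()
resolved["days_to_resolution"] = (resolved["closed"] - resolved["opened"]).dt.days

# Mean and median days to resolution for each release.
print(resolved.groupby("version")["days_to_resolution"].agg(["mean", "median"]))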

New graphs!

I re-downloaded the historical data to add some additional fields. This allowed me to take a look at a few areas we hadn’t examined previously.

Security

The first area I wanted to look at is the number of bugs tagged as being security-related. Fedora Linux 33 had the highest count of security bugs, with over 1200. Looking at the graph, there’s a big jump between F26 and F27. This suggests a process change. I’ll have to check with Red Hat’s product security team to see if they have an explanation.

Security bugs by release

The good news is that we’re fixing more security bugs than we’re not. The bad news is that the proportion of security bugs going unfixed is increasing. To be more precise, more bug reports are not marked as fixed: security fixes often come in upstream releases that aren’t specifically tied to a Bugzilla bug.

Fixed and unfixed security bugs by release
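
As a rough illustration of how that fixed/unfixed split can be computed, here is a small sketch. It assumes the export carries Bugzilla’s "keywords" and "resolution" fields; the exact column names in the real data set may differ.

import pandas as pd

# Hypothetical export of Fedora bug reports.
bugs = pd.read_csv("bugs.csv")

# Keep only reports tagged with the Security keyword.
security = bugs[bugs["keywords"].fillna("").str.contains("Security")]

# Treat these Bugzilla resolutions as "fixed"; everything else (EOL,
# WONTFIX, duplicates, ...) counts as not fixed.
fixed_resolutions = {"ERRATA", "CURRENTRELEASE", "NEXTRELEASE", "RAWHIDE"}
security = security.assign(fixed=security["resolution"].isin(fixed_resolutions))

# Share of fixed vs. unfixed security bugs per release.
print(security.groupby("version")["fixed"].value_counts(normalize=True))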

Like with other bug reports, we’re fixing security bugs faster than in the past. 50% of security bugs are resolved within about two weeks.

Mean and median time to resolution for security bugs by release

QA processes

I also wanted to look at how our QA processes are reflected in the bugs. During discussion of an F35 blocker candidate, Adam Williamson commented that it seemed like we have been interpreting the release criteria more loosely lately. In other words, bugs that would not have been accepted as blockers in the past are accepted now. The numbers bear this out. While the number of both accepted and rejected blockers is down significantly from F19, there’s a general upward trend in accepted blockers since F30.

Accepted and rejected blockers by release

We have seen a big increase in accepted freeze exceptions recently. In fact, it looks exponential. Interestingly, the number of rejected freeze exceptions has stayed roughly the same over that period.

Accepted and rejected freeze exceptions by release

Finally, I was curious to see whether our use of the common bugs mechanism has changed over time. It has: we mark far fewer bugs than we did five or more years ago. I will be interested to see if the experiment that uses Ask Fedora to handle common issues changes the trend at all. We’ll have to wait until May 2023.

Number of bugs tagged as a common bug per release

Analysis

The graphs are pretty, but what do they mean? We have to be careful not to read too much into them. What’s in Bugzilla represents bug reports, not necessarily bugs. Some reports aren’t actual bugs and some bugs don’t have reports. And there’s a lot of “why” that we can’t pull from a summary analysis.

That said, it’s clear that we’re getting more bug reports than we can handle. Some of these should properly be filed upstream. How can we improve on the rest? We can’t do it all at once, but perhaps by working on some subset, we can make improvements. The one that jumps out to me is the security bugs. Can we bring more attention to those? I’ll spend the holiday break thinking about how to make them more visible so that they’re fixed or handled more quickly.

In the meantime, I’d love to hear your ideas, too. If you’d like to examine the data for yourself, everything is in the fedora-bug-data repo.
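
If you want a starting point, a few lines of pandas are enough to reproduce the simplest counts. This sketch assumes a CSV export of the Bugzilla reports with a "version" column for the release and Bugzilla’s "resolution" field; adjust the names to match whatever the repo actually ships.

import pandas as pd

# Hypothetical export of the bug reports from the fedora-bug-data repo.
bugs = pd.read_csv("bugs.csv")

# Reports filed against each Fedora Linux release.
print(bugs["version"].value_counts().sort_index())

# How many of those were closed as duplicates?
duplicates = bugs[bugs["resolution"] == "DUPLICATE"]
print(duplicates["version"].value_counts().sort_index())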

tmt hint 02: under the hood

After making the first steps with tmt and investigating the provisioning options, let’s now dive a little deeper and look under the hood to see how plans, tests and stories work together, using a couple of examples.

Continue reading

CPE to staff EPEL work

We are pleased to announce that Red Hat is establishing a small team directly responsible for participating in EPEL activities. Their job isn’t to displace the EPEL community, but rather to support it full-time. We expect many beneficial effects, among them better EPEL readiness for RHEL major releases. The EPEL team will be part of the wider Community Platform Engineering group, or CPE for short.

As a reminder, CPE is the Red Hat team combining IT and release engineering from Fedora and CentOS.

Right now we are staffing up the team and expect to begin this work in October 2021. Keep an eye on the EPEL mailing list and the associated tracker as we begin this exciting journey with the EPEL community.

Exploring our bugs, part 3: time to resolution

This is the third and final part of a series I promised during my Nest With Fedora talk (also called “Exploring Our Bugs”). In this post, I’ll analyze the time it takes to resolve bug reports from Fedora Linux 19 to Fedora Linux 32. If you want to do your own analysis, the Jupyter notebook and source data are available on Pagure. These posts are not written to advocate any specific changes or policies. In fact, they may ask more questions than they answer.

Continue reading

Exploring our bugs, part 2: resolution

This is the second part of a series I promised during my Nest With Fedora talk (also called “Exploring Our Bugs”). In this post, I’ll be analyzing the bug report resolutions from Fedora Linux 19 to Fedora Linux 32. If you want to do your own analysis, the Jupyter notebook and source data are available on Pagure. These posts are not written to advocate any specific changes or policies. In fact, they may ask more questions than they answer.

Continue reading

Exploring our bugs, part 1: the basics

This is the first part of a series I promised during my Nest With Fedora talk (also called “Exploring Our Bugs”). In this post, I’ll review some of the basic statistics from analyzing bugs from Fedora Linux 19 to Fedora Linux 32. If you want to do your own analysis, the Jupyter notebook and source data are available on Pagure. These posts are not written to advocate any specific changes or policies. In fact, they may ask more questions than they answer. This first post looks at some basic information, including counts, priorities, and duplicates.

Continue reading

Source-git SIG report #1

Greetings from the Fedora source-git SIG! We are planning to start publishing reports of what we are working on so everyone can easily pay attention and get involved if interested. If you have any ideas, comments or requests, don’t be shy and let us know 🙂

Here’s a short list of the things we are working on.

Continue reading

Migrating the DNF Stack CI to GitHub Actions

DNF’s continuous integration (CI) has historically struggled on several fronts: reliability, coverage, and results not being publicly available. We recently migrated to GitHub Actions, which increased the stability and coverage of our integration test suite and made its results publicly available to contributors.

Continue reading

tmt hint 01: provisioning options

After the initial hint describing the very first steps with tmt, let’s have a look at the available test execution options. Recall the user story from the very beginning:

As a tester or developer, I want to easily run tests in my preferred environment.

Do you want to safely run tests without breaking your laptop? Use the default provision method, virtual, which will execute tests in a virtual machine using libvirt with the help of testcloud:

Continue reading
