  1. 05eee61 x86/kvm: Add "nopvspin" parameter to disable PV spinlocks by Zhenzhong Duan · 5 years ago
  2. 5709712 locking/qspinlock: Fix inaccessible URL of MCS lock paper by Waiman Long · 5 years ago
  3. c942fdd treewide: Replace GPLv2 boilerplate/reference with SPDX - rule 157 by Thomas Gleixner · 6 years ago
  4. ad53fa1 locking/qspinlock_stat: Introduce generic lockevent_*() counting APIs by Waiman Long · 6 years ago
  5. 733000c locking/qspinlock: Remove unnecessary BUG_ON() call by Waiman Long · 6 years ago
  6. 412f34a locking/qspinlock_stat: Track the no MCS node available case by Waiman Long · 6 years ago
  7. d682b59 locking/qspinlock: Handle > 4 slowpath nesting levels by Waiman Long · 6 years ago
  8. 0fa809c locking/pvqspinlock: Extend node size when pvqspinlock is configured by Waiman Long · 6 years ago
  9. 1222109 locking/qspinlock_stat: Count instances of nested lock slowpaths by Waiman Long · 6 years ago
  10. 7aa54be locking/qspinlock, x86: Provide liveness guarantee by Peter Zijlstra · 6 years ago
  11. 756b1df locking/qspinlock: Rework some comments by Peter Zijlstra · 6 years ago
  12. 53bf57f locking/qspinlock: Re-order code by Peter Zijlstra · 6 years ago
  13. 81d3dc9 locking/qspinlock: Add stat tracking for pending vs. slowpath by Waiman Long · 7 years ago
  14. ae75d90 locking/qspinlock: Use try_cmpxchg() instead of cmpxchg() when locking by Will Deacon · 7 years ago
  15. 9d4646d locking/qspinlock: Elide back-to-back RELEASE operations with smp_wmb() by Will Deacon · 7 years ago
  16. c131a19 locking/qspinlock: Use smp_cond_load_relaxed() to wait for next node by Will Deacon · 7 years ago
  17. f9c811fa locking/qspinlock: Use atomic_cond_read_acquire() by Will Deacon · 7 years ago
  18. c61da58 locking/qspinlock: Kill cmpxchg() loop when claiming lock from head of queue by Will Deacon · 7 years ago
  19. 59fb586 locking/qspinlock: Remove unbounded cmpxchg() loop from locking slowpath by Will Deacon · 7 years ago
  20. 6512276 locking/qspinlock: Bound spinning on pending->locked transition in slowpath by Will Deacon · 7 years ago
  21. 625e88b locking/qspinlock: Merge 'struct __qspinlock' into 'struct qspinlock' by Will Deacon · 7 years ago
  22. 11dc132 locking/qspinlock: Ensure node->count is updated before initialising node by Will Deacon · 7 years ago
  23. 95bcade locking/qspinlock: Ensure node is initialised before updating prev->next by Will Deacon · 7 years ago
  24. 548095d locking: Remove smp_read_barrier_depends() from queued_spin_lock_slowpath() by Paul E. McKenney · 7 years ago
  25. d3a024a locking: Remove spin_unlock_wait() generic definitions by Paul E. McKenney · 8 years ago
  26. 5671360 locking/qspinlock: Explicitly include asm/prefetch.h by Stafford Horne · 8 years ago
  27. 0dceeaf locking/qspinlock: Use __this_cpu_dec() instead of full-blown this_cpu_dec() by Pan Xinhui · 9 years ago
  28. 33ac279 locking/barriers: Introduce smp_acquire__after_ctrl_dep() by Peter Zijlstra · 9 years ago
  29. 1f03e8d locking/barriers: Replace smp_cond_acquire() with smp_cond_load_acquire() by Peter Zijlstra · 9 years ago
  30. 055ce0f locking/qspinlock: Add comments by Peter Zijlstra · 9 years ago
  31. 8d53fa1 locking/qspinlock: Clarify xchg_tail() ordering by Peter Zijlstra · 9 years ago
  32. 2c61002 locking/qspinlock: Fix spin_unlock_wait() some more by Peter Zijlstra · 9 years ago
  33. cb037fd locking/qspinlock: Use smp_cond_acquire() in pending code by Waiman Long · 9 years ago
  34. cd0272f locking/pvqspinlock: Queue node adaptive spinning by Waiman Long · 9 years ago
  35. 1c4941f locking/pvqspinlock: Allow limited lock stealing by Waiman Long · 9 years ago
  36. b3e0b1b locking, sched: Introduce smp_cond_acquire() and use it by Peter Zijlstra · 9 years ago
  37. aa68744 locking/qspinlock: Avoid redundant read of next pointer by Waiman Long · 9 years ago
  38. 81b5598 locking/qspinlock: Prefetch the next node cacheline by Waiman Long · 9 years ago
  39. 64d816c locking/qspinlock: Use _acquire/_release() versions of cmpxchg() & xchg() by Waiman Long · 9 years ago
  40. 43b3f02 locking/qspinlock/x86: Fix performance regression under unaccelerated VMs by Peter Zijlstra · 9 years ago
  41. 75d2270 locking/pvqspinlock: Only kick CPU at unlock time by Waiman Long · 10 years ago
  42. a23db28 locking/pvqspinlock: Implement simple paravirt support for the qspinlock by Waiman Long · 10 years ago
  43. 2aa79af locking/qspinlock: Revert to test-and-set on hypervisors by Peter Zijlstra (Intel) · 10 years ago
  44. 2c83e8e locking/qspinlock: Use a simple write to grab the lock by Waiman Long · 10 years ago
  45. 69f9cae locking/qspinlock: Optimize for smaller NR_CPUS by Peter Zijlstra (Intel) · 10 years ago
  46. 6403bd7 locking/qspinlock: Extract out code snippets for the next patch by Waiman Long · 10 years ago
  47. c1fb159 locking/qspinlock: Add pending bit by Peter Zijlstra (Intel) · 10 years ago
  48. a33fda3 locking/qspinlock: Introduce a simple generic 4-byte queued spinlock by Waiman Long · 10 years ago
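
The commits above trace the evolution of the generic queued spinlock: a 4-byte lock word whose low byte is the locked value, whose bit 8 is the pending bit added in c1fb159, and (in the kernel layout for smaller NR_CPUS) whose upper 16 bits encode the MCS tail, with the paravirt variants layered on top. What follows is a rough, userspace-only sketch of that lock-word layout and the pending-bit hand-off, for orientation only. It is not the kernel code: the per-CPU MCS node queue that gives the "queued" spinlock its name is omitted entirely (extra waiters here simply spin), and every name in it (qspinlock_sketch, qs_lock, qs_unlock, Q_LOCKED_VAL, Q_PENDING_VAL) is illustrative rather than taken from the kernel source.

/*
 * Illustrative sketch of the 4-byte lock word the commits above manipulate:
 * bits 0-7 are the locked byte, bit 8 is the pending bit, and bits 16-31
 * would hold the MCS tail (CPU index + node index) in the real qspinlock.
 *
 * NOT the kernel implementation: no MCS tail queue, no paravirt hooks,
 * no per-arch optimizations. Additional waiters just spin.
 */
#include <stdatomic.h>
#include <stdint.h>
#include <stdio.h>
#include <pthread.h>

#define Q_LOCKED_VAL   (1u << 0)   /* locked byte, bits 0-7 */
#define Q_PENDING_VAL  (1u << 8)   /* pending bit, bit 8    */
/* bits 16-31 would encode the MCS tail in the real lock word */

struct qspinlock_sketch {
	_Atomic uint32_t val;
};

static void qs_lock(struct qspinlock_sketch *l)
{
	uint32_t exp = 0;

	/* Fast path: the word is completely 0, grab the locked byte in one CAS. */
	if (atomic_compare_exchange_strong_explicit(&l->val, &exp, Q_LOCKED_VAL,
						    memory_order_acquire,
						    memory_order_relaxed))
		return;

	for (;;) {
		/* Try to become the single "pending" waiter. */
		uint32_t old = atomic_fetch_or_explicit(&l->val, Q_PENDING_VAL,
							memory_order_relaxed);
		if (!(old & Q_PENDING_VAL)) {
			/*
			 * We own the pending bit: wait for the holder to drop
			 * the locked byte, then flip pending -> locked in one
			 * atomic step so the next waiter can claim pending.
			 */
			exp = Q_PENDING_VAL;
			while (!atomic_compare_exchange_weak_explicit(
					&l->val, &exp, Q_LOCKED_VAL,
					memory_order_acquire,
					memory_order_relaxed))
				exp = Q_PENDING_VAL;  /* retry until locked == 0 */
			return;
		}

		/*
		 * Someone else already holds the pending slot. The real
		 * slowpath queues on a per-CPU MCS node here; this sketch
		 * just waits for pending to clear and retries.
		 */
		while (atomic_load_explicit(&l->val, memory_order_relaxed) &
		       Q_PENDING_VAL)
			;  /* cpu_relax() equivalent */
	}
}

static void qs_unlock(struct qspinlock_sketch *l)
{
	/* Clear only the locked byte; pending (and tail) bits stay untouched. */
	atomic_fetch_and_explicit(&l->val, ~(uint32_t)0xff,
				  memory_order_release);
}

/* Tiny smoke test: several threads bump a shared counter under the lock. */
static struct qspinlock_sketch test_lock;
static long counter;

static void *worker(void *arg)
{
	(void)arg;
	for (int i = 0; i < 100000; i++) {
		qs_lock(&test_lock);
		counter++;
		qs_unlock(&test_lock);
	}
	return NULL;
}

int main(void)
{
	pthread_t t[4];

	for (int i = 0; i < 4; i++)
		pthread_create(&t[i], NULL, worker, NULL);
	for (int i = 0; i < 4; i++)
		pthread_join(t[i], NULL);

	printf("counter = %ld (expected %d)\n", counter, 4 * 100000);
	return 0;
}

Build with something like "cc -pthread sketch.c" and run; the counter should come out exact. Compared with this sketch, the real slowpath parks additional waiters on per-CPU MCS nodes so each CPU spins on its own cacheline instead of hammering the shared lock word; the entries above about node->count ordering, prefetching the next node cacheline, extending the node array for pvqspinlock, and handling more than 4 nesting levels all concern that queue.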