Mirror of https://github.com/edk2-porting/linux-next.git (synced 2024-12-22 04:03:58 +08:00)
Path: linux-next/include/asm-generic/bitops
Commit 19de85ef57 ("bitops: add #ifndef for each of find bitops") by Akinobu Mita
The style that we normally use in asm-generic is to test the macro itself
for existence, so in asm-generic, do:

	#ifndef find_next_zero_bit_le
	extern unsigned long find_next_zero_bit_le(const void *addr,
		unsigned long size, unsigned long offset);
	#endif

and in the architectures, write

	static inline unsigned long find_next_zero_bit_le(const void *addr,
		unsigned long size, unsigned long offset)
	#define find_next_zero_bit_le find_next_zero_bit_le

This adds the #ifndef for each of the find bitops in the generic header
and source files.

Suggested-by: Arnd Bergmann <arnd@arndb.de>
Signed-off-by: Akinobu Mita <akinobu.mita@gmail.com>
Acked-by: Russell King <rmk+kernel@arm.linux.org.uk>
Cc: Martin Schwidefsky <schwidefsky@de.ibm.com>
Cc: Heiko Carstens <heiko.carstens@de.ibm.com>
Cc: Greg Ungerer <gerg@uclinux.org>
Signed-off-by: Andrew Morton <akpm@linux-foundation.org>
Signed-off-by: Linus Torvalds <torvalds@linux-foundation.org>
2011-05-26 17:12:38 -07:00
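
For readers unfamiliar with this kernel idiom, here is a minimal, self-contained sketch (ordinary userspace C, not the real kernel headers) of how the two halves quoted in the commit message fit together: the "architecture" side supplies a static inline and then defines a same-named macro, so the "generic" side's #ifndef guard compiles its extern fallback out. The naive bit-scanning body and the main() driver are illustrative assumptions only.

	#include <stdio.h>

	/* "Architecture" half: provide the helper, then announce it with a
	 * same-named macro so the generic header can detect the override. */
	static inline unsigned long find_next_zero_bit_le(const void *addr,
			unsigned long size, unsigned long offset)
	{
		const unsigned char *p = addr;
		unsigned long bit;

		/* Naive little-endian scan, purely for illustration. */
		for (bit = offset; bit < size; bit++)
			if (!(p[bit / 8] & (1u << (bit % 8))))
				return bit;
		return size;
	}
	#define find_next_zero_bit_le find_next_zero_bit_le

	/* "Generic" half: declare the fallback only when no override macro
	 * exists; here the #ifndef is false, so nothing is declared. */
	#ifndef find_next_zero_bit_le
	extern unsigned long find_next_zero_bit_le(const void *addr,
			unsigned long size, unsigned long offset);
	#endif

	int main(void)
	{
		unsigned char bits[2] = { 0xff, 0xfd };	/* first zero is bit 9 */

		/* A macro that expands to its own name is not re-expanded, so
		 * this call binds to the static inline defined above. */
		printf("%lu\n", find_next_zero_bit_le(bits, 16, 0));
		return 0;
	}

Calling through the name is harmless: the self-referential macro exists only so the generic header can test for it, and when an architecture provides its own helper the generic extern declaration is never emitted at all.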
File             Last commit                                                           Date
__ffs.h          x86, generic: mark complex bitops.h inlines as __always_inline       2009-01-13 18:56:30 +01:00
__fls.h          x86, generic: mark complex bitops.h inlines as __always_inline       2009-01-13 18:56:30 +01:00
arch_hweight.h   arch, hweight: Fix compilation errors                                2010-05-04 10:25:27 -07:00
atomic.h         locking: Convert __raw_spin* functions to arch_spin*                 2009-12-14 23:55:32 +01:00
const_hweight.h  bitops: Optimize hweight() by making use of compile-time evaluation  2010-04-06 15:52:11 -07:00
ext2-atomic.h    asm-generic: use little-endian bitops                                2011-03-23 19:46:15 -07:00
ffs.h            [PATCH] bitops: generic ffs()                                        2006-03-26 08:57:11 -08:00
ffz.h            [PATCH] bitops: generic ffz()                                        2006-03-26 08:57:10 -08:00
find.h           bitops: add #ifndef for each of find bitops                          2011-05-26 17:12:38 -07:00
fls64.h          x86, generic: mark complex bitops.h inlines as __always_inline       2009-01-13 18:56:30 +01:00
fls.h            x86, generic: mark complex bitops.h inlines as __always_inline       2009-01-13 18:56:30 +01:00
hweight.h        bitops: Optimize hweight() by making use of compile-time evaluation  2010-04-06 15:52:11 -07:00
le.h             bitops: add #ifndef for each of find bitops                          2011-05-26 17:12:38 -07:00
lock.h           bitops: introduce lock ops                                           2007-10-18 14:37:29 -07:00
non-atomic.h     define first set of BIT* macros                                      2007-10-19 11:53:42 -07:00
sched.h          sched: simplify sched_find_first_bit()                               2007-07-09 18:52:00 +02:00