[multiple changes]

2005-06-29  Daniel Berlin  <dberlin@dberlin.org>

	* tree-complex.c (complex_variable_components): Now a hashtable.
	(cvc_lookup): Ditto.
	(cvc_insert): Ditto.
	(create_components): Use referenced var iterator.
	Initialize hashtable.  Use cvc_insert/lookup.
	(extract_components): Use cvc_insert/lookup.
	(update_complex_components): Ditto.
	(update_complex_components_on_edge): Ditto.
	* tree-dfa.c (referenced_vars): Now a hashtable.
	(dump_referenced_vars): Use iterator.
	(referenced_var_lookup): New function.
	(referenced_var_insert): Ditto.
	(add_referenced_var): Use referenced_var_insert.
	(mark_new_vars_to_rename): Use DECL_UID.
	* tree-flow-inline.h (first_htab_element): New function.
	(end_htab_p): Ditto.
	(next_htab_element): Ditto.
	(first_referenced_var): Ditto.
	(end_referenced_vars_p): Ditto.
	(next_referenced_var): Ditto.
	(is_call_clobbered): Use DECL_UID.
	(mark_call_clobbered): Ditto.
	(clear_call_clobbered): Ditto.
	(mark_non_addressable): Ditto.
	* tree-flow.h (htab_iterator): New struct.
	(FOR_EACH_HTAB_ELEMENT): New macro.
	(struct int_tree_map): New struct.
	(int_tree_map_hash): Prototype.
	(int_tree_map_eq): Ditto.
	(referenced_var_iterator): Ditto.
	(FOR_EACH_REFERENCED_VAR): New macro.
	(referenced_vars): Now a hashtable.
	* tree-into-ssa.c (symbol_marked_for_renaming): Use DECL_UID.
	(add_new_name_mapping): Ditto.
	(mark_def_sites): Ditto.
	(insert_phi_nodes): Use referenced_var iterator.
	(mark_def_site_blocks): Ditto.
	(mark_sym_for_renaming): Use DECL_UID.
	* tree-sra.c (is_sra_candidate_decl): Use DECL_UID.
	(lookup_element): Ditto.
	(find_candidates_for_sra): Use referenced_vars iterator.
	Use DECL_UID.
	* tree-ssa-alias.c (NUM_REFERENCES): New macro.
	(NUM_REFERENCES_CLEAR): Ditto.
	(NUM_REFERENCES_INC): Ditto.
	(NUM_REFERENCES_SET): Ditto.
	(alias_obstack): New bitmap obstack.
	(struct alias_map_d): Use bitmap, not sbitmap.
	(struct alias_info): Remove num_references.
	(init_alias_info): Use referenced_var iterator.
	Initialize bitmap obstack.
	(delete_alias_info): Use referenced_var iterator.
	Free bitmap obstack.
	(compute_points_to_and_addr_escape): Use DECL_UID.
	Use new NUM_REFERENCES macros.
	(compute_flow_sensitive_aliasing): may_aliases is now a bitmap.
	Use new NUM_REFERENCES macros.
	(group_aliases_into): Update prototype to use bitmap.
	(setup_pointers_and_addressables): Use referenced_vars iterator.
	Use DECL_UID. Use new NUM_REFERENCES macros.
	(add_pointed_to_var): Use DECL_UID.
	(dump_alias_info): Use referenced_var iterator.
	(add_type_alias): Ditto.
	(used_portions): Now a hashtable.
	(used_part_map_eq): New function.
	(used_part_map_hash): Ditto.
	(free_used_part_map): Ditto.
	(up_lookup): Ditto.
	(up_insert): Ditto.
	(get_or_create_used_part_for): Use up_lookup.
	(create_overlap_variables_for): Ditto.
	(find_used_portions): Use up_insert.
	Use DECL_UID.
	(create_structure_vars): Init used_portions hashtable, use
	referenced_vars iterator.
	* tree-ssa-live.c (create_ssa_var_map): sbitmaps became bitmaps.
	Use DECL_UID.
	* tree-ssa-loop-im.c (gather_mem_refs_stmt): Use DECL_UID.
	* tree-ssa-operands.c (get_asm_expr_operands): Ditto.
	(note_addressable): Ditto.
	* tree-ssa-structalias.c (set_uids_in_ptset): Ditto.
	* tree-ssa.c (verify_flow_insensitive_alias_info): Use
	referenced_var iterator.
	Use DECL_UID.
	(delete_tree_ssa): Ditto.
	(int_tree_map_eq): New function.
	(int_tree_map_hash): Ditto.
	* tree-stdarg.c (find_va_list_reference): Use DECL_UID.
	(va_list_ptr_read): Ditto.
	(va_list_counter_struct_op): Ditto.
	(va_list_ptr_write): Ditto.
	(check_va_list_escapes): Ditto.
	(check_all_va_list_escapes): Ditto.
	(execute_optimize_stdarg): Ditto.
	* tree-tailcall.c (suitable_for_tail_opt_p): Use referenced_var
	iterator.

2005-06-30  Daniel Berlin  <dberlin@dberlin.org>

	* hashtab.h (HTAB_DELETED_ENTRY): New macro.
	(HTAB_EMPTY_ENTRY): New macro.

2005-06-30  Daniel Berlin  <dberlin@dberlin.org>

	* hashtab.c (EMPTY_ENTRY): Moved and renamed.
	(DELETED_ENTRY): Ditto.
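
The include/hashtab.h and libiberty/hashtab.c hunks are not reproduced
below.  Going by the two entries above and by how libiberty marks unused
slots, the new macros plausibly just export the formerly private sentinels
(a sketch, not the verbatim patch):

/* include/hashtab.h: sentinel values stored in unused slots.  Previously
   private to hashtab.c as EMPTY_ENTRY and DELETED_ENTRY; exporting them
   lets clients such as the new htab_iterator skip dead slots.  */
#define HTAB_EMPTY_ENTRY    ((PTR) 0)
#define HTAB_DELETED_ENTRY  ((PTR) 1)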

From-SVN: r101480
Daniel Berlin 2005-06-30 22:18:42 +00:00 committed by Daniel Berlin
parent 114a6b1d3b
commit a3648cfc0c
19 changed files with 652 additions and 294 deletions


gcc/tree-complex.c

@ -52,8 +52,37 @@ DEF_VEC_ALLOC_I(complex_lattice_t, heap);
static VEC(complex_lattice_t, heap) *complex_lattice_values;
/* For each complex variable, a pair of variables for the components. */
static VEC(tree, heap) *complex_variable_components;
/* For each complex variable, a pair of component variables is kept in this
   hashtable, keyed by 2*DECL_UID and 2*DECL_UID + 1.  */
static htab_t complex_variable_components;
/* Lookup UID in the complex_variable_components hashtable and return the
associated tree. */
static tree
cvc_lookup (unsigned int uid)
{
struct int_tree_map *h, in;
in.uid = uid;
h = htab_find_with_hash (complex_variable_components, &in, uid);
gcc_assert (h);
return h->to;
}
/* Insert the pair UID, TO into the complex_variable_components hashtable. */
static void
cvc_insert (unsigned int uid, tree to)
{
struct int_tree_map *h;
void **loc;
h = xmalloc (sizeof (struct int_tree_map));
h->uid = uid;
h->to = to;
loc = htab_find_slot_with_hash (complex_variable_components, h,
uid, INSERT);
*(struct int_tree_map **) loc = h;
}
/* Return true if T is not a zero constant. In the case of real values,
@ -355,18 +384,19 @@ complex_visit_phi (tree phi)
static void
create_components (void)
{
size_t k, n;
size_t n;
tree var;
referenced_var_iterator rvi;
n = num_referenced_vars;
if (n == 0)
return;
complex_variable_components = VEC_alloc (tree, heap, 2*n);
VEC_safe_grow (tree, heap, complex_variable_components, 2*n);
complex_variable_components = htab_create (10, int_tree_map_hash,
int_tree_map_eq, free);
for (k = 0; k < n; ++k)
FOR_EACH_REFERENCED_VAR (var, rvi)
{
tree var = referenced_var (k);
tree r = NULL, i = NULL;
if (var != NULL
@ -409,8 +439,8 @@ create_components (void)
}
}
VEC_replace (tree, complex_variable_components, 2*k, r);
VEC_replace (tree, complex_variable_components, 2*k + 1, i);
cvc_insert (2 * DECL_UID (var), r);
cvc_insert (2 * DECL_UID (var) + 1, i);
}
}
@ -464,8 +494,7 @@ extract_component (block_stmt_iterator *bsi, tree t, bool imagpart_p,
}
}
return VEC_index (tree, complex_variable_components,
var_ann (SSA_NAME_VAR (t))->uid * 2 + imagpart_p);
return cvc_lookup (DECL_UID (SSA_NAME_VAR (t)) * 2 + imagpart_p);
}
default:
@ -478,16 +507,16 @@ extract_component (block_stmt_iterator *bsi, tree t, bool imagpart_p,
static void
update_complex_components (block_stmt_iterator *bsi, tree stmt, tree r, tree i)
{
unsigned int uid = var_ann (SSA_NAME_VAR (TREE_OPERAND (stmt, 0)))->uid;
unsigned int uid = DECL_UID (SSA_NAME_VAR (TREE_OPERAND (stmt, 0)));
tree v, x;
v = VEC_index (tree, complex_variable_components, 2*uid);
v = cvc_lookup (2*uid);
x = build2 (MODIFY_EXPR, TREE_TYPE (v), v, r);
SET_EXPR_LOCUS (x, EXPR_LOCUS (stmt));
TREE_BLOCK (x) = TREE_BLOCK (stmt);
bsi_insert_after (bsi, x, BSI_NEW_STMT);
v = VEC_index (tree, complex_variable_components, 2*uid + 1);
v = cvc_lookup (2*uid + 1);
x = build2 (MODIFY_EXPR, TREE_TYPE (v), v, i);
SET_EXPR_LOCUS (x, EXPR_LOCUS (stmt));
TREE_BLOCK (x) = TREE_BLOCK (stmt);
@ -497,10 +526,10 @@ update_complex_components (block_stmt_iterator *bsi, tree stmt, tree r, tree i)
static void
update_complex_components_on_edge (edge e, tree stmt, tree lhs, tree r, tree i)
{
unsigned int uid = var_ann (SSA_NAME_VAR (lhs))->uid;
unsigned int uid = DECL_UID (SSA_NAME_VAR (lhs));
tree v, x;
v = VEC_index (tree, complex_variable_components, 2*uid);
v = cvc_lookup (2*uid);
x = build2 (MODIFY_EXPR, TREE_TYPE (v), v, r);
if (stmt)
{
@ -509,7 +538,7 @@ update_complex_components_on_edge (edge e, tree stmt, tree lhs, tree r, tree i)
}
bsi_insert_on_edge (e, x);
v = VEC_index (tree, complex_variable_components, 2*uid + 1);
v = cvc_lookup (2*uid + 1);
x = build2 (MODIFY_EXPR, TREE_TYPE (v), v, i);
if (stmt)
{
@ -1406,7 +1435,9 @@ tree_lower_complex (void)
bsi_commit_edge_inserts ();
VEC_free (tree, heap, complex_variable_components);
if (complex_variable_components)
htab_delete (complex_variable_components);
VEC_free (complex_lattice_t, heap, complex_lattice_values);
}
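
In short, tree-complex.c stops indexing a VEC by the old var_ann uid and
instead keys one hash table by 2*DECL_UID and 2*DECL_UID + 1.  An
illustrative fragment using the names from this patch (get_component_vars
is hypothetical, not part of the commit):

/* Fetch both component variables of complex variable VAR after
   create_components has populated the table.  */
static void
get_component_vars (tree var, tree *real_p, tree *imag_p)
{
  unsigned int uid = DECL_UID (var);
  *real_p = cvc_lookup (2 * uid);      /* real part */
  *imag_p = cvc_lookup (2 * uid + 1);  /* imaginary part */
}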

gcc/tree-dfa.c

@ -83,7 +83,7 @@ static void add_referenced_var (tree, struct walk_state *);
/* Global declarations. */
/* Hashtable of all variables referenced in the function.  */
VEC(tree,gc) *referenced_vars;
htab_t referenced_vars;
/*---------------------------------------------------------------------------
@ -230,14 +230,14 @@ make_rename_temp (tree type, const char *prefix)
void
dump_referenced_vars (FILE *file)
{
size_t i;
tree var;
referenced_var_iterator rvi;
fprintf (file, "\nReferenced variables in %s: %u\n\n",
get_name (current_function_decl), (unsigned) num_referenced_vars);
for (i = 0; i < num_referenced_vars; i++)
FOR_EACH_REFERENCED_VAR (var, rvi)
{
tree var = referenced_var (i);
fprintf (file, "Variable: ");
dump_variable (file, var);
fprintf (file, "\n");
@ -278,7 +278,7 @@ dump_variable (FILE *file, tree var)
ann = var_ann (var);
fprintf (file, ", UID %u", (unsigned) ann->uid);
fprintf (file, ", UID %u", (unsigned) DECL_UID (var));
fprintf (file, ", ");
print_generic_expr (file, TREE_TYPE (var), dump_flags);
@ -528,6 +528,36 @@ find_vars_r (tree *tp, int *walk_subtrees, void *data)
}
/* Lookup UID in the referenced_vars hashtable and return the associated
variable. */
tree
referenced_var_lookup (unsigned int uid)
{
struct int_tree_map *h, in;
in.uid = uid;
h = htab_find_with_hash (referenced_vars, &in, uid);
gcc_assert (h || uid == 0);
if (h)
return h->to;
return NULL_TREE;
}
/* Insert the pair UID, TO into the referenced_vars hashtable. */
static void
referenced_var_insert (unsigned int uid, tree to)
{
struct int_tree_map *h;
void **loc;
h = ggc_alloc (sizeof (struct int_tree_map));
h->uid = uid;
h->to = to;
loc = htab_find_slot_with_hash (referenced_vars, h, uid, INSERT);
*(struct int_tree_map **) loc = h;
}
/* Add VAR to the list of referenced variables.
WALK_STATE contains a hash table used to avoid adding the same
@ -555,8 +585,8 @@ add_referenced_var (tree var, struct walk_state *walk_state)
intrinsic to the variable. */
if (slot)
*slot = (void *) var;
v_ann->uid = num_referenced_vars;
VEC_safe_push (tree, gc, referenced_vars, var);
referenced_var_insert (DECL_UID (var), var);
/* Global variables are always call-clobbered. */
if (is_global_var (var))
@ -646,7 +676,7 @@ mark_new_vars_to_rename (tree stmt)
{
if (!DECL_P (val))
val = SSA_NAME_VAR (val);
bitmap_set_bit (vars_in_vops_to_rename, var_ann (val)->uid);
bitmap_set_bit (vars_in_vops_to_rename, DECL_UID (val));
}
/* Now force an operand re-scan on the statement and mark any newly
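
The uid-to-tree mapping above is the core trick of the whole patch.  Below
is a standalone model of it, substituting a char * payload for a GCC tree
and assuming libiberty's hashtab.h is on the include path (link with
-liberty); as in referenced_var_lookup, the uid doubles as its own hash
value:

#include <stdio.h>
#include <stdlib.h>
#include "hashtab.h"

/* Miniature int_tree_map analogue: key plus payload.  */
struct int_str_map
{
  unsigned int uid;
  const char *to;
};

static hashval_t
map_hash (const void *item)
{
  return ((const struct int_str_map *) item)->uid;
}

static int
map_eq (const void *va, const void *vb)
{
  return ((const struct int_str_map *) va)->uid
         == ((const struct int_str_map *) vb)->uid;
}

/* Mirrors referenced_var_insert: heap-allocate the pair and store it
   in the slot found by the precomputed hash.  */
static void
map_insert (htab_t tab, unsigned int uid, const char *to)
{
  struct int_str_map *h = malloc (sizeof *h);
  void **loc;
  h->uid = uid;
  h->to = to;
  loc = htab_find_slot_with_hash (tab, h, uid, INSERT);
  *loc = h;
}

/* Mirrors referenced_var_lookup: probe with a stack dummy.  */
static const char *
map_lookup (htab_t tab, unsigned int uid)
{
  struct int_str_map in, *h;
  in.uid = uid;
  h = htab_find_with_hash (tab, &in, uid);
  return h ? h->to : NULL;
}

int
main (void)
{
  htab_t tab = htab_create (10, map_hash, map_eq, free);
  map_insert (tab, 1002, "some_decl");
  printf ("%s\n", map_lookup (tab, 1002));  /* prints some_decl */
  htab_delete (tab);                        /* `free' reclaims entries */
  return 0;
}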

gcc/tree-flow-inline.h

@ -25,6 +25,87 @@ Boston, MA 02110-1301, USA. */
/* Inline functions for manipulating various data structures defined in
tree-flow.h. See tree-flow.h for documentation. */
/* Initialize the hashtable iterator HTI to point to hashtable TABLE and
   return the first live element, or NULL if the table is empty.  */
static inline void *
first_htab_element (htab_iterator *hti, htab_t table)
{
hti->htab = table;
hti->slot = table->entries;
hti->limit = hti->slot + htab_size (table);
do
{
PTR x = *(hti->slot);
if (x != HTAB_EMPTY_ENTRY && x != HTAB_DELETED_ENTRY)
break;
} while (++(hti->slot) < hti->limit);
if (hti->slot < hti->limit)
return *(hti->slot);
return NULL;
}
/* Return true if the hashtable iterator HTI has reached the end of the
   table.  */
static inline bool
end_htab_p (htab_iterator *hti)
{
if (hti->slot >= hti->limit)
return true;
return false;
}
/* Advance the hashtable iterator HTI to the next live element of the
   hashtable and return it, or NULL if there are no more elements.  */
static inline void *
next_htab_element (htab_iterator *hti)
{
while (++(hti->slot) < hti->limit)
{
PTR x = *(hti->slot);
if (x != HTAB_EMPTY_ENTRY && x != HTAB_DELETED_ENTRY)
return x;
};
return NULL;
}
/* Initialize ITER to point to the first referenced variable in the
referenced_vars hashtable, and return that variable. */
static inline tree
first_referenced_var (referenced_var_iterator *iter)
{
struct int_tree_map *itm;
itm = first_htab_element (&iter->hti, referenced_vars);
if (!itm)
return NULL;
return itm->to;
}
/* Return true if we have hit the end of the referenced variables ITER is
iterating through. */
static inline bool
end_referenced_vars_p (referenced_var_iterator *iter)
{
return end_htab_p (&iter->hti);
}
/* Make ITER point to the next referenced_var in the referenced_var hashtable,
and return that variable. */
static inline tree
next_referenced_var (referenced_var_iterator *iter)
{
struct int_tree_map *itm;
itm = next_htab_element (&iter->hti);
if (!itm)
return NULL;
return itm->to;
}
/* Return the variable annotation for T, which must be a _DECL node.
Return NULL if the variable annotation doesn't already exist. */
static inline var_ann_t
@ -752,7 +833,7 @@ static inline bool
is_call_clobbered (tree var)
{
return is_global_var (var)
|| bitmap_bit_p (call_clobbered_vars, var_ann (var)->uid);
|| bitmap_bit_p (call_clobbered_vars, DECL_UID (var));
}
/* Mark variable VAR as being clobbered by function calls. */
@ -766,7 +847,7 @@ mark_call_clobbered (tree var)
location in global memory. */
if (ann->mem_tag_kind != NOT_A_TAG && ann->mem_tag_kind != STRUCT_FIELD)
DECL_EXTERNAL (var) = 1;
bitmap_set_bit (call_clobbered_vars, ann->uid);
bitmap_set_bit (call_clobbered_vars, DECL_UID (var));
ssa_call_clobbered_cache_valid = false;
ssa_ro_call_cache_valid = false;
}
@ -778,7 +859,7 @@ clear_call_clobbered (tree var)
var_ann_t ann = var_ann (var);
if (ann->mem_tag_kind != NOT_A_TAG && ann->mem_tag_kind != STRUCT_FIELD)
DECL_EXTERNAL (var) = 0;
bitmap_clear_bit (call_clobbered_vars, ann->uid);
bitmap_clear_bit (call_clobbered_vars, DECL_UID (var));
ssa_call_clobbered_cache_valid = false;
ssa_ro_call_cache_valid = false;
}
@ -787,7 +868,7 @@ clear_call_clobbered (tree var)
static inline void
mark_non_addressable (tree var)
{
bitmap_clear_bit (call_clobbered_vars, var_ann (var)->uid);
bitmap_clear_bit (call_clobbered_vars, DECL_UID (var));
TREE_ADDRESSABLE (var) = 0;
ssa_call_clobbered_cache_valid = false;
ssa_ro_call_cache_valid = false;
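
The three inline functions above are the expansion target of
FOR_EACH_HTAB_ELEMENT.  Spelled out by hand (an illustrative fragment, not
standalone code; process_var is a hypothetical stand-in for the loop body):

/* Equivalent of
     FOR_EACH_HTAB_ELEMENT (referenced_vars, itm,
                            struct int_tree_map *, hti)  */
htab_iterator hti;
struct int_tree_map *itm;

for (itm = (struct int_tree_map *) first_htab_element (&hti, referenced_vars);
     !end_htab_p (&hti);
     itm = (struct int_tree_map *) next_htab_element (&hti))
  process_var (itm->to);  /* hypothetical per-element action */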

gcc/tree-flow.h

@ -41,6 +41,20 @@ typedef struct basic_block_def *basic_block;
/* True if the code is in ssa form. */
extern bool in_ssa_p;
typedef struct
{
htab_t htab;
PTR *slot;
PTR *limit;
} htab_iterator;
/* Iterate through the elements of hashtable HTAB, using htab_iterator ITER,
storing each element in RESULT, which is of type TYPE. */
#define FOR_EACH_HTAB_ELEMENT(HTAB, RESULT, TYPE, ITER) \
for (RESULT = (TYPE) first_htab_element (&(ITER), (HTAB)); \
!end_htab_p (&(ITER)); \
RESULT = (TYPE) next_htab_element (&(ITER)))
/*---------------------------------------------------------------------------
Attributes for SSA_NAMEs.
@ -208,9 +222,6 @@ struct var_ann_d GTY(())
/* Variables that may alias this variable. */
varray_type may_aliases;
/* Unique ID of this variable. */
size_t uid;
/* Used when going out of SSA form to indicate which partition this
variable represents storage for. */
unsigned partition;
@ -356,11 +367,33 @@ static inline void set_phi_nodes (basic_block, tree);
/*---------------------------------------------------------------------------
Global declarations
---------------------------------------------------------------------------*/
/* Array of all variables referenced in the function. */
extern GTY(()) VEC(tree,gc) *referenced_vars;
struct int_tree_map GTY(())
{
#define num_referenced_vars VEC_length (tree, referenced_vars)
#define referenced_var(i) VEC_index (tree, referenced_vars, i)
unsigned int uid;
tree to;
};
extern unsigned int int_tree_map_hash (const void *);
extern int int_tree_map_eq (const void *, const void *);
typedef struct
{
htab_iterator hti;
} referenced_var_iterator;
#define FOR_EACH_REFERENCED_VAR(VAR, ITER) \
for ((VAR) = first_referenced_var (&(ITER)); \
!end_referenced_vars_p (&(ITER)); \
(VAR) = next_referenced_var (&(ITER)))
/* Hashtable of all variables referenced in the function.  */
extern GTY((param_is (struct int_tree_map))) htab_t referenced_vars;
extern tree referenced_var_lookup (unsigned int);
#define num_referenced_vars htab_elements (referenced_vars)
#define referenced_var(i) referenced_var_lookup (i)
/* Array of all SSA_NAMEs used in the function. */
extern GTY(()) VEC(tree,gc) *ssa_names;
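
Usage-wise, FOR_EACH_REFERENCED_VAR replaces the retired dense-index loop,
since DECL_UIDs are sparse and can no longer index an array.  A minimal
sketch of the new idiom in GCC context (count_pointer_vars is a made-up
example function):

/* Old idiom, now gone:
     for (i = 0; i < num_referenced_vars; i++)
       var = referenced_var (i);
   New idiom, hashtable-backed:  */
static unsigned
count_pointer_vars (void)
{
  referenced_var_iterator rvi;
  tree var;
  unsigned count = 0;

  FOR_EACH_REFERENCED_VAR (var, rvi)
    if (POINTER_TYPE_P (TREE_TYPE (var)))
      count++;
  return count;
}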

gcc/tree-into-ssa.c

@ -454,7 +454,7 @@ static inline bool
symbol_marked_for_renaming (tree sym)
{
gcc_assert (DECL_P (sym));
return bitmap_bit_p (syms_to_rename, var_ann (sym)->uid);
return bitmap_bit_p (syms_to_rename, DECL_UID (sym));
}
@ -582,7 +582,7 @@ add_new_name_mapping (tree new, tree old)
Otherwise, the insertion of PHI nodes for each of the old
names in these mappings will be very slow. */
sym = SSA_NAME_VAR (new);
uid = var_ann (sym)->uid;
uid = DECL_UID (sym);
update_ssa_stats.num_virtual_mappings++;
if (!bitmap_bit_p (update_ssa_stats.virtual_symbols, uid))
{
@ -651,7 +651,7 @@ mark_def_sites (struct dom_walk_data *walk_data,
{
tree sym = USE_FROM_PTR (use_p);
gcc_assert (DECL_P (sym));
if (!bitmap_bit_p (kills, var_ann (sym)->uid))
if (!bitmap_bit_p (kills, DECL_UID (sym)))
set_livein_block (sym, bb);
REWRITE_THIS_STMT (stmt) = 1;
}
@ -676,7 +676,7 @@ mark_def_sites (struct dom_walk_data *walk_data,
{
gcc_assert (DECL_P (def));
set_def_block (def, bb, false);
bitmap_set_bit (kills, var_ann (def)->uid);
bitmap_set_bit (kills, DECL_UID (def));
REGISTER_DEFS_IN_THIS_STMT (stmt) = 1;
}
@ -861,15 +861,15 @@ insert_phi_nodes_for (tree var, bitmap phi_insertion_points, bool update_p)
static void
insert_phi_nodes (bitmap *dfs)
{
unsigned i;
referenced_var_iterator rvi;
tree var;
timevar_push (TV_TREE_INSERT_PHI_NODES);
for (i = 0; i < num_referenced_vars; i++)
FOR_EACH_REFERENCED_VAR (var, rvi)
{
struct def_blocks_d *def_map;
bitmap idf;
tree var = referenced_var (i);
def_map = find_def_blocks_for (var);
if (def_map == NULL)
@ -1662,16 +1662,16 @@ mark_def_sites_initialize_block (struct dom_walk_data *walk_data,
static void
mark_def_site_blocks (sbitmap interesting_blocks)
{
size_t i;
struct dom_walk_data walk_data;
struct mark_def_sites_global_data mark_def_sites_global_data;
referenced_var_iterator rvi;
tree var;
/* Allocate memory for the DEF_BLOCKS hash table. */
def_blocks = htab_create (VEC_length (tree, referenced_vars),
def_blocks = htab_create (num_referenced_vars,
def_blocks_hash, def_blocks_eq, def_blocks_free);
for (i = 0; i < num_referenced_vars; i++)
set_current_def (referenced_var (i), NULL_TREE);
FOR_EACH_REFERENCED_VAR(var, rvi)
set_current_def (var, NULL_TREE);
/* Setup callbacks for the generic dominator tree walker to find and
mark definition sites. */
@ -2287,7 +2287,7 @@ mark_sym_for_renaming (tree sym)
if (need_to_initialize_update_ssa_p)
init_update_ssa ();
bitmap_set_bit (syms_to_rename, var_ann (sym)->uid);
bitmap_set_bit (syms_to_rename, DECL_UID (sym));
if (!is_gimple_reg (sym))
need_to_update_vops_p = true;
@ -2769,7 +2769,7 @@ update_ssa (unsigned update_flags)
EXECUTE_IF_SET_IN_BITMAP (syms_to_rename, 0, i, bi)
insert_updated_phi_nodes_for (referenced_var (i), dfs, blocks,
update_flags);
update_flags);
FOR_EACH_BB (bb)
BITMAP_FREE (dfs[bb->index]);
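
Note that update_ssa can still turn a set bit back into a decl because
referenced_var (i) is now referenced_var_lookup (i): the renaming bitmaps
are keyed by DECL_UID, and the hashtable maps each uid back.  A sketch of
the round trip (mark_and_check is hypothetical):

static void
mark_and_check (tree sym)
{
  unsigned i;
  bitmap_iterator bi;

  /* Bits in syms_to_rename are DECL_UIDs...  */
  bitmap_set_bit (syms_to_rename, DECL_UID (sym));

  /* ...and referenced_var maps each one back to its decl.  */
  EXECUTE_IF_SET_IN_BITMAP (syms_to_rename, 0, i, bi)
    gcc_assert (referenced_var (i) != NULL_TREE);
}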

gcc/tree-sra.c

@ -151,7 +151,7 @@ static tree generate_element_ref (struct sra_elt *);
static bool
is_sra_candidate_decl (tree decl)
{
return DECL_P (decl) && bitmap_bit_p (sra_candidates, var_ann (decl)->uid);
return DECL_P (decl) && bitmap_bit_p (sra_candidates, DECL_UID (decl));
}
/* Return true if TYPE is a scalar type. */
@ -493,7 +493,7 @@ lookup_element (struct sra_elt *parent, tree child, tree type,
if (TREE_CODE (child) == PARM_DECL)
{
elt->n_copies = 1;
bitmap_set_bit (needs_copy_in, var_ann (child)->uid);
bitmap_set_bit (needs_copy_in, DECL_UID (child));
}
}
@ -940,15 +940,15 @@ sra_walk_function (const struct sra_walk_fns *fns)
static bool
find_candidates_for_sra (void)
{
size_t i;
bool any_set = false;
tree var;
referenced_var_iterator rvi;
for (i = 0; i < num_referenced_vars; i++)
FOR_EACH_REFERENCED_VAR (var, rvi)
{
tree var = referenced_var (i);
if (decl_can_be_decomposed_p (var))
{
bitmap_set_bit (sra_candidates, var_ann (var)->uid);
bitmap_set_bit (sra_candidates, DECL_UID (var));
any_set = true;
}
}

gcc/tree-ssa-alias.c

@ -43,6 +43,19 @@ Boston, MA 02110-1301, USA. */
#include "convert.h"
#include "params.h"
#include "vec.h"
#include "bitmap.h"
/* Keep track of how many times each pointer has been dereferenced in
   the program, using the annotation's aux field.  This is used by the
   alias grouping heuristic in compute_flow_insensitive_aliasing.  */
#define NUM_REFERENCES(ANN) ((size_t)((ANN)->common.aux))
#define NUM_REFERENCES_CLEAR(ANN) ((ANN)->common.aux) = 0
#define NUM_REFERENCES_INC(ANN) (ANN)->common.aux = (void*) (((size_t)((ANN)->common.aux)) + 1)
#define NUM_REFERENCES_SET(ANN, VAL) (ANN)->common.aux = (void*) ((void *)(VAL))
/* Obstack used to hold grouping bitmaps and other temporary bitmaps used by
   aliasing.  */
static bitmap_obstack alias_obstack;
/* 'true' after aliases have been computed (see compute_may_aliases). */
bool aliases_computed_p;
@ -66,7 +79,7 @@ struct alias_map_d
/* Set of variables aliased with VAR. This is the exact same
information contained in VAR_ANN (VAR)->MAY_ALIASES, but in
bitmap form to speed up alias grouping. */
sbitmap may_aliases;
bitmap may_aliases;
};
@ -100,11 +113,6 @@ struct alias_info
/* Number of const/pure function calls found in the program. */
size_t num_pure_const_calls_found;
/* Array of counters to keep track of how many times each pointer has
been dereferenced in the program. This is used by the alias grouping
heuristic in compute_flow_insensitive_aliasing. */
varray_type num_references;
/* Total number of virtual operands that will be needed to represent
all the aliases of all the pointers found in the program. */
long total_alias_vops;
@ -490,16 +498,18 @@ static struct alias_info *
init_alias_info (void)
{
struct alias_info *ai;
referenced_var_iterator rvi;
tree var;
bitmap_obstack_initialize (&alias_obstack);
ai = xcalloc (1, sizeof (struct alias_info));
ai->ssa_names_visited = sbitmap_alloc (num_ssa_names);
sbitmap_zero (ai->ssa_names_visited);
VARRAY_TREE_INIT (ai->processed_ptrs, 50, "processed_ptrs");
ai->addresses_needed = BITMAP_ALLOC (NULL);
VARRAY_UINT_INIT (ai->num_references, num_referenced_vars, "num_references");
ai->written_vars = BITMAP_ALLOC (NULL);
ai->dereferenced_ptrs_store = BITMAP_ALLOC (NULL);
ai->dereferenced_ptrs_load = BITMAP_ALLOC (NULL);
ai->addresses_needed = BITMAP_ALLOC (&alias_obstack);
ai->written_vars = BITMAP_ALLOC (&alias_obstack);
ai->dereferenced_ptrs_store = BITMAP_ALLOC (&alias_obstack);
ai->dereferenced_ptrs_load = BITMAP_ALLOC (&alias_obstack);
/* If aliases have been computed before, clear existing information. */
if (aliases_computed_p)
@ -512,13 +522,13 @@ init_alias_info (void)
bitmap_clear (addressable_vars);
/* Clear flow-insensitive alias information from each symbol. */
for (i = 0; i < num_referenced_vars; i++)
FOR_EACH_REFERENCED_VAR (var, rvi)
{
tree var = referenced_var (i);
var_ann_t ann = var_ann (var);
ann->is_alias_tag = 0;
ann->may_aliases = NULL;
NUM_REFERENCES_CLEAR (ann);
/* Since we are about to re-discover call-clobbered
variables, clear the call-clobbered flag. Variables that
@ -577,29 +587,32 @@ static void
delete_alias_info (struct alias_info *ai)
{
size_t i;
referenced_var_iterator rvi;
tree var;
sbitmap_free (ai->ssa_names_visited);
ai->processed_ptrs = NULL;
BITMAP_FREE (ai->addresses_needed);
for (i = 0; i < ai->num_addressable_vars; i++)
free (ai->addressable_vars[i]);
FOR_EACH_REFERENCED_VAR(var, rvi)
{
sbitmap_free (ai->addressable_vars[i]->may_aliases);
free (ai->addressable_vars[i]);
var_ann_t ann = var_ann (var);
NUM_REFERENCES_CLEAR (ann);
}
free (ai->addressable_vars);
for (i = 0; i < ai->num_pointers; i++)
{
sbitmap_free (ai->pointers[i]->may_aliases);
free (ai->pointers[i]);
}
free (ai->pointers[i]);
free (ai->pointers);
ai->num_references = NULL;
BITMAP_FREE (ai->written_vars);
BITMAP_FREE (ai->dereferenced_ptrs_store);
BITMAP_FREE (ai->dereferenced_ptrs_load);
bitmap_obstack_release (&alias_obstack);
free (ai);
}
@ -621,7 +634,6 @@ collect_points_to_info_for (struct alias_info *ai, tree ptr)
}
}
/* Traverse use-def links for all the pointers in the program to collect
address escape and points-to information.
@ -660,14 +672,15 @@ compute_points_to_and_addr_escape (struct alias_info *ai)
EXECUTE_IF_SET_IN_BITMAP (addr_taken, 0, i, bi)
{
tree var = referenced_var (i);
bitmap_set_bit (ai->addresses_needed, var_ann (var)->uid);
bitmap_set_bit (ai->addresses_needed, DECL_UID (var));
if (stmt_escapes_p)
mark_call_clobbered (var);
}
FOR_EACH_SSA_TREE_OPERAND (op, stmt, iter, SSA_OP_USE)
{
var_ann_t v_ann = var_ann (SSA_NAME_VAR (op));
tree var = SSA_NAME_VAR (op);
var_ann_t v_ann = var_ann (var);
struct ptr_info_def *pi;
bool is_store;
unsigned num_uses, num_derefs;
@ -679,8 +692,8 @@ compute_points_to_and_addr_escape (struct alias_info *ai)
because we are processing regular variables, not
memory tags (the array's initial size is set to
NUM_REFERENCED_VARS). */
if (may_be_aliased (SSA_NAME_VAR (op)))
(VARRAY_UINT (ai->num_references, v_ann->uid))++;
if (may_be_aliased (var))
NUM_REFERENCES_INC (v_ann);
if (!POINTER_TYPE_P (TREE_TYPE (op)))
continue;
@ -700,18 +713,18 @@ compute_points_to_and_addr_escape (struct alias_info *ai)
pi->is_dereferenced = 1;
/* Keep track of how many times we've dereferenced each
pointer. Again, we don't need to grow
AI->NUM_REFERENCES because we're processing
existing program variables. */
(VARRAY_UINT (ai->num_references, v_ann->uid))++;
pointer. */
NUM_REFERENCES_INC (v_ann);
/* If this is a store operation, mark OP as being
dereferenced to store, otherwise mark it as being
dereferenced to load. */
if (is_store)
bitmap_set_bit (ai->dereferenced_ptrs_store, v_ann->uid);
bitmap_set_bit (ai->dereferenced_ptrs_store,
DECL_UID (var));
else
bitmap_set_bit (ai->dereferenced_ptrs_load, v_ann->uid);
bitmap_set_bit (ai->dereferenced_ptrs_load,
DECL_UID (var));
}
if (stmt_escapes_p && num_derefs < num_uses)
@ -727,7 +740,8 @@ compute_points_to_and_addr_escape (struct alias_info *ai)
operation inside the called function. */
if (get_call_expr_in (stmt))
{
bitmap_set_bit (ai->dereferenced_ptrs_store, v_ann->uid);
bitmap_set_bit (ai->dereferenced_ptrs_store,
DECL_UID (var));
pi->is_dereferenced = 1;
}
}
@ -740,9 +754,9 @@ compute_points_to_and_addr_escape (struct alias_info *ai)
{
tree var = SSA_NAME_VAR (op);
var_ann_t ann = var_ann (var);
bitmap_set_bit (ai->written_vars, ann->uid);
bitmap_set_bit (ai->written_vars, DECL_UID (var));
if (may_be_aliased (var))
(VARRAY_UINT (ai->num_references, ann->uid))++;
NUM_REFERENCES_INC (ann);
if (POINTER_TYPE_P (TREE_TYPE (op)))
collect_points_to_info_for (ai, op);
@ -752,8 +766,7 @@ compute_points_to_and_addr_escape (struct alias_info *ai)
FOR_EACH_SSA_TREE_OPERAND (op, stmt, iter, SSA_OP_VIRTUAL_DEFS)
{
tree var = DECL_P (op) ? op : SSA_NAME_VAR (op);
var_ann_t ann = var_ann (var);
bitmap_set_bit (ai->written_vars, ann->uid);
bitmap_set_bit (ai->written_vars, DECL_UID (var));
}
/* After promoting variables and computing aliasing we will
@ -861,7 +874,6 @@ create_name_tags (struct alias_info *ai)
}
/* For every pointer P_i in AI->PROCESSED_PTRS, create may-alias sets for
the name memory tag (NMT) associated with P_i. If P_i escapes, then its
name tag and the variables it points-to are call-clobbered. Finally, if
@ -961,8 +973,7 @@ compute_flow_insensitive_aliasing (struct alias_info *ai)
var_ann_t tag_ann = var_ann (tag);
p_map->total_alias_vops = 0;
p_map->may_aliases = sbitmap_alloc (num_referenced_vars);
sbitmap_zero (p_map->may_aliases);
p_map->may_aliases = BITMAP_ALLOC (&alias_obstack);
for (j = 0; j < ai->num_addressable_vars; j++)
{
@ -985,9 +996,9 @@ compute_flow_insensitive_aliasing (struct alias_info *ai)
So we first check the call_clobbered status of the
tag and variable before querying the bitmap. */
tag_stored_p = is_call_clobbered (tag)
|| bitmap_bit_p (ai->written_vars, tag_ann->uid);
|| bitmap_bit_p (ai->written_vars, DECL_UID (tag));
var_stored_p = is_call_clobbered (var)
|| bitmap_bit_p (ai->written_vars, v_ann->uid);
|| bitmap_bit_p (ai->written_vars, DECL_UID (var));
if (!tag_stored_p && !var_stored_p)
continue;
@ -996,8 +1007,8 @@ compute_flow_insensitive_aliasing (struct alias_info *ai)
subvar_t svars;
size_t num_tag_refs, num_var_refs;
num_tag_refs = VARRAY_UINT (ai->num_references, tag_ann->uid);
num_var_refs = VARRAY_UINT (ai->num_references, v_ann->uid);
num_tag_refs = NUM_REFERENCES (tag_ann);
num_var_refs = NUM_REFERENCES (v_ann);
/* Add VAR to TAG's may-aliases set. */
@ -1013,7 +1024,7 @@ compute_flow_insensitive_aliasing (struct alias_info *ai)
add_may_alias (tag, sv->var);
/* Update the bitmap used to represent TAG's alias set
in case we need to group aliases. */
SET_BIT (p_map->may_aliases, var_ann (sv->var)->uid);
bitmap_set_bit (p_map->may_aliases, DECL_UID (sv->var));
}
}
else
@ -1021,7 +1032,7 @@ compute_flow_insensitive_aliasing (struct alias_info *ai)
add_may_alias (tag, var);
/* Update the bitmap used to represent TAG's alias set
in case we need to group aliases. */
SET_BIT (p_map->may_aliases, var_ann (var)->uid);
bitmap_set_bit (p_map->may_aliases, DECL_UID (var));
}
/* Update the total number of virtual operands due to
@ -1063,13 +1074,13 @@ compute_flow_insensitive_aliasing (struct alias_info *ai)
size_t j;
struct alias_map_d *p_map1 = ai->pointers[i];
tree tag1 = var_ann (p_map1->var)->type_mem_tag;
sbitmap may_aliases1 = p_map1->may_aliases;
bitmap may_aliases1 = p_map1->may_aliases;
for (j = i + 1; j < ai->num_pointers; j++)
{
struct alias_map_d *p_map2 = ai->pointers[j];
tree tag2 = var_ann (p_map2->var)->type_mem_tag;
sbitmap may_aliases2 = p_map2->may_aliases;
bitmap may_aliases2 = p_map2->may_aliases;
/* If the pointers may not point to each other, do nothing. */
if (!may_alias_p (p_map1->var, p_map1->set, tag2, p_map2->set))
@ -1077,26 +1088,26 @@ compute_flow_insensitive_aliasing (struct alias_info *ai)
/* The two pointers may alias each other. If they already have
symbols in common, do nothing. */
if (sbitmap_any_common_bits (may_aliases1, may_aliases2))
if (bitmap_intersect_p (may_aliases1, may_aliases2))
continue;
if (sbitmap_first_set_bit (may_aliases2) >= 0)
if (!bitmap_empty_p (may_aliases2))
{
unsigned int k;
sbitmap_iterator sbi;
bitmap_iterator bi;
/* Add all the aliases for TAG2 into TAG1's alias set.
FIXME, update grouping heuristic counters. */
EXECUTE_IF_SET_IN_SBITMAP (may_aliases2, 0, k, sbi)
EXECUTE_IF_SET_IN_BITMAP (may_aliases2, 0, k, bi)
add_may_alias (tag1, referenced_var (k));
sbitmap_a_or_b (may_aliases1, may_aliases1, may_aliases2);
bitmap_ior_into (may_aliases1, may_aliases2);
}
else
{
/* Since TAG2 does not have any aliases of its own, add
TAG2 itself to the alias set of TAG1. */
add_may_alias (tag1, tag2);
SET_BIT (may_aliases1, var_ann (tag2)->uid);
bitmap_set_bit (may_aliases1, DECL_UID (tag2));
}
}
}
@ -1141,14 +1152,14 @@ total_alias_vops_cmp (const void *p, const void *q)
may-aliases(V2) = { TAG } */
static void
group_aliases_into (tree tag, sbitmap tag_aliases, struct alias_info *ai)
group_aliases_into (tree tag, bitmap tag_aliases, struct alias_info *ai)
{
unsigned int i;
var_ann_t tag_ann = var_ann (tag);
size_t num_tag_refs = VARRAY_UINT (ai->num_references, tag_ann->uid);
sbitmap_iterator sbi;
size_t num_tag_refs = NUM_REFERENCES (tag_ann);
bitmap_iterator bi;
EXECUTE_IF_SET_IN_SBITMAP (tag_aliases, 0, i, sbi)
EXECUTE_IF_SET_IN_BITMAP (tag_aliases, 0, i, bi)
{
tree var = referenced_var (i);
var_ann_t ann = var_ann (var);
@ -1256,7 +1267,7 @@ group_aliases (struct alias_info *ai)
{
size_t j;
tree tag1 = var_ann (ai->pointers[i]->var)->type_mem_tag;
sbitmap tag1_aliases = ai->pointers[i]->may_aliases;
bitmap tag1_aliases = ai->pointers[i]->may_aliases;
/* Skip tags that have been grouped already. */
if (ai->pointers[i]->grouped_p)
@ -1267,16 +1278,16 @@ group_aliases (struct alias_info *ai)
aliases into TAG1. */
for (j = i + 1; j < ai->num_pointers; j++)
{
sbitmap tag2_aliases = ai->pointers[j]->may_aliases;
bitmap tag2_aliases = ai->pointers[j]->may_aliases;
if (sbitmap_any_common_bits (tag1_aliases, tag2_aliases))
if (bitmap_intersect_p (tag1_aliases, tag2_aliases))
{
tree tag2 = var_ann (ai->pointers[j]->var)->type_mem_tag;
sbitmap_a_or_b (tag1_aliases, tag1_aliases, tag2_aliases);
bitmap_ior_into (tag1_aliases, tag2_aliases);
/* TAG2 does not need its aliases anymore. */
sbitmap_zero (tag2_aliases);
bitmap_clear (tag2_aliases);
var_ann (tag2)->may_aliases = NULL;
/* TAG1 is the unique alias of TAG2. */
@ -1372,14 +1383,15 @@ create_alias_map_for (tree var, struct alias_info *ai)
static void
setup_pointers_and_addressables (struct alias_info *ai)
{
size_t i, n_vars, num_addressable_vars, num_pointers;
size_t n_vars, num_addressable_vars, num_pointers;
referenced_var_iterator rvi;
tree var;
/* Size up the arrays ADDRESSABLE_VARS and POINTERS. */
num_addressable_vars = num_pointers = 0;
for (i = 0; i < num_referenced_vars; i++)
{
tree var = referenced_var (i);
FOR_EACH_REFERENCED_VAR (var, rvi)
{
if (may_be_aliased (var))
num_addressable_vars++;
@ -1388,7 +1400,7 @@ setup_pointers_and_addressables (struct alias_info *ai)
/* Since we don't keep track of volatile variables, assume that
these pointers are used in indirect store operations. */
if (TREE_THIS_VOLATILE (var))
bitmap_set_bit (ai->dereferenced_ptrs_store, var_ann (var)->uid);
bitmap_set_bit (ai->dereferenced_ptrs_store, DECL_UID (var));
num_pointers++;
}
@ -1410,9 +1422,8 @@ setup_pointers_and_addressables (struct alias_info *ai)
unnecessarily. */
n_vars = num_referenced_vars;
for (i = 0; i < n_vars; i++)
FOR_EACH_REFERENCED_VAR (var, rvi)
{
tree var = referenced_var (i);
var_ann_t v_ann = var_ann (var);
subvar_t svars;
@ -1436,7 +1447,7 @@ setup_pointers_and_addressables (struct alias_info *ai)
cleanup passes. */
if (TREE_ADDRESSABLE (var) && v_ann->mem_tag_kind != STRUCT_FIELD)
{
if (!bitmap_bit_p (ai->addresses_needed, v_ann->uid)
if (!bitmap_bit_p (ai->addresses_needed, DECL_UID (var))
&& TREE_CODE (var) != RESULT_DECL
&& !is_global_var (var))
{
@ -1453,8 +1464,8 @@ setup_pointers_and_addressables (struct alias_info *ai)
for (sv = svars; sv; sv = sv->next)
{
var_ann_t svann = var_ann (sv->var);
if (bitmap_bit_p (ai->addresses_needed, svann->uid))
if (bitmap_bit_p (ai->addresses_needed,
DECL_UID (sv->var)))
okay_to_mark = false;
mark_sym_for_renaming (sv->var);
}
@ -1472,13 +1483,13 @@ setup_pointers_and_addressables (struct alias_info *ai)
used when scanning operands for ASM_EXPRs that
clobber memory. In those cases, we need to clobber
all call-clobbered variables and all addressables. */
bitmap_set_bit (addressable_vars, v_ann->uid);
bitmap_set_bit (addressable_vars, DECL_UID (var));
if (var_can_have_subvars (var)
&& (svars = get_subvars_for_var (var)))
{
subvar_t sv;
for (sv = svars; sv; sv = sv->next)
bitmap_set_bit (addressable_vars, var_ann (sv->var)->uid);
bitmap_set_bit (addressable_vars, DECL_UID (sv->var));
}
}
@ -1496,8 +1507,8 @@ setup_pointers_and_addressables (struct alias_info *ai)
array and create a type memory tag for them. */
if (POINTER_TYPE_P (TREE_TYPE (var)))
{
if ((bitmap_bit_p (ai->dereferenced_ptrs_store, v_ann->uid)
|| bitmap_bit_p (ai->dereferenced_ptrs_load, v_ann->uid)))
if ((bitmap_bit_p (ai->dereferenced_ptrs_store, DECL_UID (var))
|| bitmap_bit_p (ai->dereferenced_ptrs_load, DECL_UID (var))))
{
tree tag;
var_ann_t t_ann;
@ -1525,8 +1536,8 @@ setup_pointers_and_addressables (struct alias_info *ai)
/* If pointer VAR has been used in a store operation,
then its memory tag must be marked as written-to. */
if (bitmap_bit_p (ai->dereferenced_ptrs_store, v_ann->uid))
bitmap_set_bit (ai->written_vars, t_ann->uid);
if (bitmap_bit_p (ai->dereferenced_ptrs_store, DECL_UID (var)))
bitmap_set_bit (ai->written_vars, DECL_UID (tag));
/* If pointer VAR is a global variable or a PARM_DECL,
then its memory tag should be considered a global
@ -1537,13 +1548,11 @@ setup_pointers_and_addressables (struct alias_info *ai)
/* All the dereferences of pointer VAR count as
references of TAG. Since TAG can be associated with
several pointers, add the dereferences of VAR to the
TAG. We may need to grow AI->NUM_REFERENCES because
we have been adding name and type tags. */
if (t_ann->uid >= VARRAY_SIZE (ai->num_references))
VARRAY_GROW (ai->num_references, t_ann->uid + 10);
TAG. */
VARRAY_UINT (ai->num_references, t_ann->uid)
+= VARRAY_UINT (ai->num_references, v_ann->uid);
NUM_REFERENCES_SET (t_ann,
NUM_REFERENCES (t_ann) +
NUM_REFERENCES (v_ann));
}
else
{
@ -1643,7 +1652,7 @@ maybe_create_global_var (struct alias_info *ai)
/* Mark all call-clobbered symbols for renaming. Since the initial
rewrite into SSA ignored all call sites, we may need to rename
.GLOBAL_VAR and the call-clobbered variables. */
.GLOBAL_VAR and the call-clobbered variables. */
EXECUTE_IF_SET_IN_BITMAP (call_clobbered_vars, 0, i, bi)
{
tree var = referenced_var (i);
@ -2019,7 +2028,7 @@ add_pointed_to_var (struct alias_info *ai, tree ptr, tree value)
subvar_t sv;
svars = get_subvars_for_var (ref);
uid = var_ann (pt_var)->uid;
uid = DECL_UID (pt_var);
if (pi->pt_vars == NULL)
pi->pt_vars = BITMAP_GGC_ALLOC ();
@ -2032,15 +2041,15 @@ add_pointed_to_var (struct alias_info *ai, tree ptr, tree value)
{
if (overlap_subvar (offset, size, sv, NULL))
{
bitmap_set_bit (pi->pt_vars, var_ann (sv->var)->uid);
bitmap_set_bit (ai->addresses_needed, var_ann (sv->var)->uid);
bitmap_set_bit (pi->pt_vars, DECL_UID (sv->var));
bitmap_set_bit (ai->addresses_needed, DECL_UID (sv->var));
}
}
}
else if (pt_var && SSA_VAR_P (pt_var))
{
uid = var_ann (pt_var)->uid;
uid = DECL_UID (pt_var);
if (pi->pt_vars == NULL)
pi->pt_vars = BITMAP_GGC_ALLOC ();
@ -2053,7 +2062,7 @@ add_pointed_to_var (struct alias_info *ai, tree ptr, tree value)
subvar_t sv;
for (sv = svars; sv; sv = sv->next)
{
uid = var_ann (sv->var)->uid;
uid = DECL_UID (sv->var);
bitmap_set_bit (ai->addresses_needed, uid);
bitmap_set_bit (pi->pt_vars, uid);
}
@ -2419,30 +2428,32 @@ dump_alias_info (FILE *file)
size_t i;
const char *funcname
= lang_hooks.decl_printable_name (current_function_decl, 2);
referenced_var_iterator rvi;
tree var;
fprintf (file, "\nFlow-insensitive alias information for %s\n\n", funcname);
fprintf (file, "Aliased symbols\n\n");
for (i = 0; i < num_referenced_vars; i++)
FOR_EACH_REFERENCED_VAR (var, rvi)
{
tree var = referenced_var (i);
if (may_be_aliased (var))
dump_variable (file, var);
}
fprintf (file, "\nDereferenced pointers\n\n");
for (i = 0; i < num_referenced_vars; i++)
FOR_EACH_REFERENCED_VAR (var, rvi)
{
tree var = referenced_var (i);
var_ann_t ann = var_ann (var);
if (ann->type_mem_tag)
dump_variable (file, var);
}
fprintf (file, "\nType memory tags\n\n");
for (i = 0; i < num_referenced_vars; i++)
FOR_EACH_REFERENCED_VAR (var, rvi)
{
tree var = referenced_var (i);
var_ann_t ann = var_ann (var);
if (ann->mem_tag_kind == TYPE_TAG)
dump_variable (file, var);
@ -2467,9 +2478,9 @@ dump_alias_info (FILE *file)
}
fprintf (file, "\nName memory tags\n\n");
for (i = 0; i < num_referenced_vars; i++)
FOR_EACH_REFERENCED_VAR (var, rvi)
{
tree var = referenced_var (i);
var_ann_t ann = var_ann (var);
if (ann->mem_tag_kind == NAME_TAG)
dump_variable (file, var);
@ -2578,19 +2589,19 @@ dump_points_to_info (FILE *file)
{
basic_block bb;
block_stmt_iterator si;
size_t i;
ssa_op_iter iter;
const char *fname =
lang_hooks.decl_printable_name (current_function_decl, 2);
referenced_var_iterator rvi;
tree var;
fprintf (file, "\n\nPointed-to sets for pointers in %s\n\n", fname);
/* First dump points-to information for the default definitions of
pointer variables. This is necessary because default definitions are
not part of the code. */
for (i = 0; i < num_referenced_vars; i++)
FOR_EACH_REFERENCED_VAR (var, rvi)
{
tree var = referenced_var (i);
if (POINTER_TYPE_P (TREE_TYPE (var)))
{
var_ann_t ann = var_ann (var);
@ -2710,12 +2721,13 @@ add_type_alias (tree ptr, tree var)
var_ann_t ann = var_ann (ptr);
subvar_t svars;
if (ann->type_mem_tag == NULL_TREE)
{
size_t i;
tree q = NULL_TREE;
tree tag_type = TREE_TYPE (TREE_TYPE (ptr));
HOST_WIDE_INT tag_set = get_alias_set (tag_type);
referenced_var_iterator rvi;
/* PTR doesn't have a type tag, create a new one and add VAR to
the new tag's alias set.
@ -2724,10 +2736,8 @@ add_type_alias (tree ptr, tree var)
whether there is another pointer Q with the same alias set as
PTR. This could be sped up by having type tags associated
with types. */
for (i = 0; i < num_referenced_vars; i++)
FOR_EACH_REFERENCED_VAR (q, rvi)
{
q = referenced_var (i);
if (POINTER_TYPE_P (TREE_TYPE (q))
&& tag_set == get_alias_set (TREE_TYPE (TREE_TYPE (q))))
{
@ -2853,6 +2863,8 @@ new_type_alias (tree ptr, tree var)
}
}
/* This represents the used range of a variable. */
typedef struct used_part
@ -2870,7 +2882,68 @@ typedef struct used_part
/* An array of used_part structures, indexed by variable uid. */
static used_part_t *used_portions;
static htab_t used_portions;
struct used_part_map
{
unsigned int uid;
used_part_t to;
};
/* Return true if the uids in the two used_part_maps are equal.  */
static int
used_part_map_eq (const void *va, const void *vb)
{
const struct used_part_map *a = va, *b = vb;
return (a->uid == b->uid);
}
/* Hash a used_part_map from its uid.  */
static unsigned int
used_part_map_hash (const void *item)
{
return ((const struct used_part_map *)item)->uid;
}
/* Free a used part map element. */
static void
free_used_part_map (void *item)
{
free (((struct used_part_map *)item)->to);
}
/* Lookup a used_part structure for a UID. */
static used_part_t
up_lookup (unsigned int uid)
{
struct used_part_map *h, in;
in.uid = uid;
h = htab_find_with_hash (used_portions, &in, uid);
if (!h)
return NULL;
return h->to;
}
/* Insert the pair UID, TO into the used part hashtable. */
static void
up_insert (unsigned int uid, used_part_t to)
{
struct used_part_map *h;
void **loc;
h = xmalloc (sizeof (struct used_part_map));
h->uid = uid;
h->to = to;
loc = htab_find_slot_with_hash (used_portions, h,
uid, INSERT);
*(struct used_part_map **) loc = h;
}
/* Given a variable uid, UID, get or create the entry in the used portions
table for the variable. */
@ -2879,7 +2952,7 @@ static used_part_t
get_or_create_used_part_for (size_t uid)
{
used_part_t up;
if (used_portions[uid] == NULL)
if ((up = up_lookup (uid)) == NULL)
{
up = xcalloc (1, sizeof (struct used_part));
up->minused = INT_MAX;
@ -2887,8 +2960,7 @@ get_or_create_used_part_for (size_t uid)
up->explicit_uses = false;
up->implicit_uses = false;
}
else
up = used_portions[uid];
return up;
}
@ -2901,12 +2973,12 @@ create_overlap_variables_for (tree var)
{
VEC(fieldoff_s,heap) *fieldstack = NULL;
used_part_t up;
size_t uid = var_ann (var)->uid;
size_t uid = DECL_UID (var);
if (used_portions[uid] == NULL)
if (!up_lookup (uid))
return;
up = used_portions[uid];
up = up_lookup (uid);
push_fields_onto_fieldstack (TREE_TYPE (var), &fieldstack, 0, NULL);
if (VEC_length (fieldoff_s, fieldstack) != 0)
{
@ -3079,7 +3151,7 @@ find_used_portions (tree *tp, int *walk_subtrees, void *data ATTRIBUTE_UNUSED)
&unsignedp, &volatilep, false);
if (DECL_P (ref) && offset == NULL && bitsize != -1)
{
size_t uid = var_ann (ref)->uid;
size_t uid = DECL_UID (ref);
used_part_t up;
up = get_or_create_used_part_for (uid);
@ -3090,7 +3162,7 @@ find_used_portions (tree *tp, int *walk_subtrees, void *data ATTRIBUTE_UNUSED)
up->maxused = bitpos + bitsize;
up->explicit_uses = true;
used_portions[uid] = up;
up_insert (uid, up);
*walk_subtrees = 0;
return NULL_TREE;
@ -3102,7 +3174,7 @@ find_used_portions (tree *tp, int *walk_subtrees, void *data ATTRIBUTE_UNUSED)
&& TREE_CODE (DECL_SIZE (ref)) == INTEGER_CST)
{
used_part_t up;
size_t uid = var_ann (ref)->uid;
size_t uid = DECL_UID (ref);
up = get_or_create_used_part_for (uid);
@ -3111,7 +3183,7 @@ find_used_portions (tree *tp, int *walk_subtrees, void *data ATTRIBUTE_UNUSED)
up->implicit_uses = true;
used_portions[uid] = up;
up_insert (uid, up);
*walk_subtrees = 0;
return NULL_TREE;
@ -3134,7 +3206,7 @@ find_used_portions (tree *tp, int *walk_subtrees, void *data ATTRIBUTE_UNUSED)
&& TREE_CODE (DECL_SIZE (var)) == INTEGER_CST)
{
used_part_t up;
size_t uid = var_ann (var)->uid;
size_t uid = DECL_UID (var);
up = get_or_create_used_part_for (uid);
@ -3142,7 +3214,7 @@ find_used_portions (tree *tp, int *walk_subtrees, void *data ATTRIBUTE_UNUSED)
up->maxused = TREE_INT_CST_LOW (DECL_SIZE (var));
up->implicit_uses = true;
used_portions[uid] = up;
up_insert (uid, up);
*walk_subtrees = 0;
return NULL_TREE;
}
@ -3157,7 +3229,7 @@ find_used_portions (tree *tp, int *walk_subtrees, void *data ATTRIBUTE_UNUSED)
&& TREE_CODE (DECL_SIZE (var)) == INTEGER_CST)
{
used_part_t up;
size_t uid = var_ann (var)->uid;
size_t uid = DECL_UID (var);
up = get_or_create_used_part_for (uid);
@ -3165,7 +3237,7 @@ find_used_portions (tree *tp, int *walk_subtrees, void *data ATTRIBUTE_UNUSED)
up->maxused = TREE_INT_CST_LOW (DECL_SIZE (var));
up->implicit_uses = true;
used_portions[uid] = up;
up_insert (uid, up);
*walk_subtrees = 0;
return NULL_TREE;
}
@ -3179,22 +3251,17 @@ find_used_portions (tree *tp, int *walk_subtrees, void *data ATTRIBUTE_UNUSED)
return NULL_TREE;
}
/* We are about to create some new referenced variables, and we need the
before size. */
static size_t old_referenced_vars;
/* Create structure field variables for structures used in this function. */
static void
create_structure_vars (void)
{
basic_block bb;
size_t i;
referenced_var_iterator rvi;
tree var;
old_referenced_vars = num_referenced_vars;
used_portions = xcalloc (num_referenced_vars, sizeof (used_part_t));
used_portions = htab_create (10, used_part_map_hash, used_part_map_eq,
free_used_part_map);
FOR_EACH_BB (bb)
{
@ -3206,9 +3273,8 @@ create_structure_vars (void)
NULL);
}
}
for (i = 0; i < old_referenced_vars; i++)
FOR_EACH_REFERENCED_VAR (var, rvi)
{
tree var = referenced_var (i);
/* The C++ FE creates vars without DECL_SIZE set, for some reason. */
if (var
&& DECL_SIZE (var)
@ -3217,10 +3283,8 @@ create_structure_vars (void)
&& TREE_CODE (DECL_SIZE (var)) == INTEGER_CST)
create_overlap_variables_for (var);
}
for (i = 0; i < old_referenced_vars; i++)
free (used_portions[i]);
htab_delete (used_portions);
free (used_portions);
}
static bool
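
The NUM_REFERENCES macros deserve a closer look: they retire the
ai->num_references varray by storing the counter in the annotation's
otherwise-unused aux pointer, cast back and forth to size_t.  A standalone
model of the trick (the two-field struct is hypothetical; the real host is
GCC's annotation, and the SET cast is cleaned up slightly here):

#include <stdio.h>

struct ann { void *aux; };

#define NUM_REFERENCES(A)        ((size_t) (A)->aux)
#define NUM_REFERENCES_CLEAR(A)  ((A)->aux = 0)
#define NUM_REFERENCES_INC(A)    ((A)->aux = (void *) (NUM_REFERENCES (A) + 1))
#define NUM_REFERENCES_SET(A, V) ((A)->aux = (void *) (size_t) (V))

int
main (void)
{
  struct ann a;

  NUM_REFERENCES_CLEAR (&a);
  NUM_REFERENCES_INC (&a);      /* counter is now 1 */
  NUM_REFERENCES_INC (&a);      /* counter is now 2 */
  NUM_REFERENCES_SET (&a, NUM_REFERENCES (&a) + 3);
  printf ("%zu\n", NUM_REFERENCES (&a));  /* prints 5 */
  return 0;
}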

gcc/tree-ssa-live.c

@ -341,18 +341,15 @@ create_ssa_var_map (int flags)
var_map map;
ssa_op_iter iter;
#ifdef ENABLE_CHECKING
sbitmap used_in_real_ops;
sbitmap used_in_virtual_ops;
bitmap used_in_real_ops;
bitmap used_in_virtual_ops;
#endif
map = init_var_map (num_ssa_names + 1);
#ifdef ENABLE_CHECKING
used_in_real_ops = sbitmap_alloc (num_referenced_vars);
sbitmap_zero (used_in_real_ops);
used_in_virtual_ops = sbitmap_alloc (num_referenced_vars);
sbitmap_zero (used_in_virtual_ops);
used_in_real_ops = BITMAP_ALLOC (NULL);
used_in_virtual_ops = BITMAP_ALLOC (NULL);
#endif
if (flags & SSA_VAR_MAP_REF_COUNT)
@ -389,7 +386,7 @@ create_ssa_var_map (int flags)
register_ssa_partition (map, use, true);
#ifdef ENABLE_CHECKING
SET_BIT (used_in_real_ops, var_ann (SSA_NAME_VAR (use))->uid);
bitmap_set_bit (used_in_real_ops, DECL_UID (SSA_NAME_VAR (use)));
#endif
}
@ -398,7 +395,7 @@ create_ssa_var_map (int flags)
register_ssa_partition (map, dest, false);
#ifdef ENABLE_CHECKING
SET_BIT (used_in_real_ops, var_ann (SSA_NAME_VAR (dest))->uid);
bitmap_set_bit (used_in_real_ops, DECL_UID (SSA_NAME_VAR (dest)));
#endif
}
@ -407,7 +404,8 @@ create_ssa_var_map (int flags)
FOR_EACH_SSA_TREE_OPERAND (use, stmt, iter,
SSA_OP_VIRTUAL_USES | SSA_OP_VMUSTDEF)
{
SET_BIT (used_in_virtual_ops, var_ann (SSA_NAME_VAR (use))->uid);
bitmap_set_bit (used_in_virtual_ops,
DECL_UID (SSA_NAME_VAR (use)));
}
#endif /* ENABLE_CHECKING */
@ -419,21 +417,21 @@ create_ssa_var_map (int flags)
#if defined ENABLE_CHECKING
{
unsigned i;
sbitmap both = sbitmap_alloc (num_referenced_vars);
sbitmap_a_and_b (both, used_in_real_ops, used_in_virtual_ops);
if (sbitmap_first_set_bit (both) >= 0)
bitmap both = BITMAP_ALLOC (NULL);
bitmap_and (both, used_in_real_ops, used_in_virtual_ops);
if (!bitmap_empty_p (both))
{
sbitmap_iterator sbi;
bitmap_iterator bi;
EXECUTE_IF_SET_IN_SBITMAP (both, 0, i, sbi)
EXECUTE_IF_SET_IN_BITMAP (both, 0, i, bi)
fprintf (stderr, "Variable %s used in real and virtual operands\n",
get_name (referenced_var (i)));
internal_error ("SSA corruption");
}
sbitmap_free (used_in_real_ops);
sbitmap_free (used_in_virtual_ops);
sbitmap_free (both);
BITMAP_FREE (used_in_real_ops);
BITMAP_FREE (used_in_virtual_ops);
BITMAP_FREE (both);
}
#endif
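
For reference, the sbitmap-to-bitmap translations applied here and in
tree-ssa-alias.c; bitmaps are sparse and need no pre-sized universe, which
is what makes sparse DECL_UID keys viable:

/* sbitmap (dense, fixed size)           bitmap (sparse, growable)
   -----------------------------------   ----------------------------------
   sbitmap_alloc (n); sbitmap_zero (s)   b = BITMAP_ALLOC (obstack_or_NULL)
   SET_BIT (s, i)                        bitmap_set_bit (b, i)
   sbitmap_a_and_b (d, a, b)             bitmap_and (d, a, b)
   sbitmap_a_or_b (a, a, b)              bitmap_ior_into (a, b)
   sbitmap_any_common_bits (a, b)        bitmap_intersect_p (a, b)
   sbitmap_first_set_bit (s) >= 0        !bitmap_empty_p (b)
   EXECUTE_IF_SET_IN_SBITMAP (...)       EXECUTE_IF_SET_IN_BITMAP (...)
   sbitmap_free (s)                      BITMAP_FREE (b)  */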

gcc/tree-ssa-loop-im.c

@ -1165,20 +1165,14 @@ gather_mem_refs_stmt (struct loop *loop, htab_t mem_refs,
FOR_EACH_SSA_TREE_OPERAND (vname, stmt, oi,
SSA_OP_VIRTUAL_USES | SSA_OP_VIRTUAL_KILLS)
{
bitmap_set_bit (ref->vops,
var_ann (SSA_NAME_VAR (vname))->uid);
}
bitmap_set_bit (ref->vops, DECL_UID (SSA_NAME_VAR (vname)));
record_mem_ref_loc (&ref->locs, stmt, mem);
return;
fail:
FOR_EACH_SSA_TREE_OPERAND (vname, stmt, oi,
SSA_OP_VIRTUAL_USES | SSA_OP_VIRTUAL_KILLS)
{
bitmap_set_bit (clobbered_vops,
var_ann (SSA_NAME_VAR (vname))->uid);
}
bitmap_set_bit (clobbered_vops, DECL_UID (SSA_NAME_VAR (vname)));
}
/* Gathers memory references in LOOP. Notes vops accessed through unrecognized

gcc/tree-ssa-operands.c

@ -1555,10 +1555,10 @@ get_asm_expr_operands (tree stmt)
add_stmt_operand (&global_var, s_ann, opf_is_def);
else
EXECUTE_IF_SET_IN_BITMAP (call_clobbered_vars, 0, i, bi)
{
tree var = referenced_var (i);
add_stmt_operand (&var, s_ann, opf_is_def | opf_non_specific);
}
{
tree var = referenced_var (i);
add_stmt_operand (&var, s_ann, opf_is_def | opf_non_specific);
}
/* Now clobber all addressables. */
EXECUTE_IF_SET_IN_BITMAP (addressable_vars, 0, i, bi)
@ -1937,10 +1937,10 @@ note_addressable (tree var, stmt_ann_t s_ann)
{
subvar_t sv;
for (sv = svars; sv; sv = sv->next)
bitmap_set_bit (s_ann->addresses_taken, var_ann (sv->var)->uid);
bitmap_set_bit (s_ann->addresses_taken, DECL_UID (sv->var));
}
else
bitmap_set_bit (s_ann->addresses_taken, var_ann (var)->uid);
bitmap_set_bit (s_ann->addresses_taken, DECL_UID (var));
}
}

gcc/tree-ssa-structalias.c

@ -2913,13 +2913,13 @@ set_uids_in_ptset (bitmap into, bitmap from)
subvar_t svars = get_subvars_for_var (vi->decl);
subvar_t sv;
for (sv = svars; sv; sv = sv->next)
bitmap_set_bit (into, var_ann (sv->var)->uid);
bitmap_set_bit (into, DECL_UID (sv->var));
}
/* We may end up with labels in the points-to set because people
take their address, and they are _DECL's. */
else if (TREE_CODE (vi->decl) == VAR_DECL
|| TREE_CODE (vi->decl) == PARM_DECL)
bitmap_set_bit (into, var_ann (vi->decl)->uid);
bitmap_set_bit (into, DECL_UID (vi->decl));
}

gcc/tree-ssa.c

@ -373,17 +373,16 @@ error:
static void
verify_flow_insensitive_alias_info (void)
{
size_t i;
tree var;
bitmap visited = BITMAP_ALLOC (NULL);
referenced_var_iterator rvi;
for (i = 0; i < num_referenced_vars; i++)
FOR_EACH_REFERENCED_VAR (var, rvi)
{
size_t j;
var_ann_t ann;
varray_type may_aliases;
var = referenced_var (i);
ann = var_ann (var);
may_aliases = ann->may_aliases;
@ -391,7 +390,7 @@ verify_flow_insensitive_alias_info (void)
{
tree alias = VARRAY_TREE (may_aliases, j);
bitmap_set_bit (visited, var_ann (alias)->uid);
bitmap_set_bit (visited, DECL_UID (alias));
if (!may_be_aliased (alias))
{
@ -402,16 +401,14 @@ verify_flow_insensitive_alias_info (void)
}
}
for (i = 0; i < num_referenced_vars; i++)
FOR_EACH_REFERENCED_VAR (var, rvi)
{
var_ann_t ann;
var = referenced_var (i);
ann = var_ann (var);
if (ann->mem_tag_kind == NOT_A_TAG
&& ann->is_alias_tag
&& !bitmap_bit_p (visited, ann->uid))
&& !bitmap_bit_p (visited, DECL_UID (var)))
{
error ("Addressable variable that is an alias tag but is not in any alias set.");
goto err;
@ -554,13 +551,13 @@ verify_name_tags (void)
for (i = 0; aliases && i < VARRAY_ACTIVE_SIZE (aliases); i++)
{
tree alias = VARRAY_TREE (aliases, i);
bitmap_set_bit (type_aliases, var_ann (alias)->uid);
bitmap_set_bit (type_aliases, DECL_UID (alias));
}
/* When grouping, we may have added PTR's type tag into the
alias set of PTR's name tag. To prevent a false
positive, pretend that TMT is in its own alias set. */
bitmap_set_bit (type_aliases, var_ann (tmt)->uid);
bitmap_set_bit (type_aliases, DECL_UID (tmt));
if (bitmap_equal_p (type_aliases, pi->pt_vars))
continue;
@ -780,13 +777,31 @@ err:
internal_error ("verify_ssa failed.");
}
/* Return true if the uids in both int_tree_maps are equal.  */
int
int_tree_map_eq (const void *va, const void *vb)
{
const struct int_tree_map *a = va, *b = vb;
return (a->uid == b->uid);
}
/* Hash a UID in an int_tree_map.  */
unsigned int
int_tree_map_hash (const void *item)
{
return ((const struct int_tree_map *)item)->uid;
}
/* Initialize global DFA and SSA structures. */
void
init_tree_ssa (void)
{
referenced_vars = VEC_alloc (tree, gc, 20);
referenced_vars = htab_create_ggc (20, int_tree_map_hash,
int_tree_map_eq, NULL);
call_clobbered_vars = BITMAP_ALLOC (NULL);
addressable_vars = BITMAP_ALLOC (NULL);
init_ssanames ();
@@ -804,6 +819,8 @@ delete_tree_ssa (void)
   size_t i;
   basic_block bb;
   block_stmt_iterator bsi;
+  referenced_var_iterator rvi;
+  tree var;
 
   /* Release any ssa_names still in use.  */
   for (i = 0; i < num_ssa_names; i++)
@@ -833,13 +850,13 @@ delete_tree_ssa (void)
     }
 
   /* Remove annotations from every referenced variable.  */
-  for (i = 0; i < num_referenced_vars; i++)
+  FOR_EACH_REFERENCED_VAR (var, rvi)
     {
-      tree var = referenced_var (i);
       ggc_free (var->common.ann);
       var->common.ann = NULL;
     }
-  VEC_free (tree, gc, referenced_vars);
+  htab_delete (referenced_vars);
   referenced_vars = NULL;
 
   fini_ssanames ();
   fini_phinodes ();
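FOR_EACH_REFERENCED_VAR, used in the loops above, has to walk hashtable slots instead of indexing a VEC. A plausible shape for the macro, sketched with deliberately hypothetical helper names (the committed helpers live in tree-flow.h and tree-flow-inline.h):

    /* Sketch only; each helper is assumed to skip empty/deleted
       slots and yield the tree stored in the next live entry.  */
    #define FOR_EACH_REFERENCED_VAR(VAR, ITER)        \
      for ((VAR) = ref_var_iter_init (&(ITER));       \
           !ref_var_iter_done_p (&(ITER));            \
           (VAR) = ref_var_iter_next (&(ITER)))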

--- a/gcc/tree-stdarg.c
+++ b/gcc/tree-stdarg.c

@@ -259,7 +259,7 @@ find_va_list_reference (tree *tp, int *walk_subtrees ATTRIBUTE_UNUSED,
     var = SSA_NAME_VAR (var);
 
   if (TREE_CODE (var) == VAR_DECL
-      && bitmap_bit_p (va_list_vars, var_ann (var)->uid))
+      && bitmap_bit_p (va_list_vars, DECL_UID (var)))
     return var;
 
   return NULL_TREE;
@@ -337,12 +337,12 @@ va_list_counter_struct_op (struct stdarg_info *si, tree ap, tree var,
     return false;
 
   if (TREE_CODE (var) != SSA_NAME
-      || bitmap_bit_p (si->va_list_vars, var_ann (SSA_NAME_VAR (var))->uid))
+      || bitmap_bit_p (si->va_list_vars, DECL_UID (SSA_NAME_VAR (var))))
     return false;
 
   base = get_base_address (ap);
   if (TREE_CODE (base) != VAR_DECL
-      || !bitmap_bit_p (si->va_list_vars, var_ann (base)->uid))
+      || !bitmap_bit_p (si->va_list_vars, DECL_UID (base)))
     return false;
 
   if (TREE_OPERAND (ap, 1) == va_list_gpr_counter_field)
@@ -361,12 +361,12 @@ static bool
 va_list_ptr_read (struct stdarg_info *si, tree ap, tree tem)
 {
   if (TREE_CODE (ap) != VAR_DECL
-      || !bitmap_bit_p (si->va_list_vars, var_ann (ap)->uid))
+      || !bitmap_bit_p (si->va_list_vars, DECL_UID (ap)))
     return false;
 
   if (TREE_CODE (tem) != SSA_NAME
       || bitmap_bit_p (si->va_list_vars,
-                       var_ann (SSA_NAME_VAR (tem))->uid)
+                       DECL_UID (SSA_NAME_VAR (tem)))
       || is_global_var (SSA_NAME_VAR (tem)))
     return false;
@@ -396,7 +396,7 @@ va_list_ptr_read (struct stdarg_info *si, tree ap, tree tem)
   /* Note the temporary, as we need to track whether it doesn't escape
      the current function.  */
   bitmap_set_bit (si->va_list_escape_vars,
-                  var_ann (SSA_NAME_VAR (tem))->uid);
+                  DECL_UID (SSA_NAME_VAR (tem)));
 
   return true;
 }
@@ -413,11 +413,11 @@ va_list_ptr_write (struct stdarg_info *si, tree ap, tree tem2)
   unsigned HOST_WIDE_INT increment;
 
   if (TREE_CODE (ap) != VAR_DECL
-      || !bitmap_bit_p (si->va_list_vars, var_ann (ap)->uid))
+      || !bitmap_bit_p (si->va_list_vars, DECL_UID (ap)))
     return false;
 
   if (TREE_CODE (tem2) != SSA_NAME
-      || bitmap_bit_p (si->va_list_vars, var_ann (SSA_NAME_VAR (tem2))->uid))
+      || bitmap_bit_p (si->va_list_vars, DECL_UID (SSA_NAME_VAR (tem2))))
     return false;
 
   if (si->compute_sizes <= 0)
@@ -455,7 +455,7 @@ check_va_list_escapes (struct stdarg_info *si, tree lhs, tree rhs)
 
   if (TREE_CODE (rhs) != SSA_NAME
       || ! bitmap_bit_p (si->va_list_escape_vars,
-                         var_ann (SSA_NAME_VAR (rhs))->uid))
+                         DECL_UID (SSA_NAME_VAR (rhs))))
     return;
 
   if (TREE_CODE (lhs) != SSA_NAME || is_global_var (SSA_NAME_VAR (lhs)))
@@ -495,7 +495,7 @@ check_va_list_escapes (struct stdarg_info *si, tree lhs, tree rhs)
     }
 
   bitmap_set_bit (si->va_list_escape_vars,
-                  var_ann (SSA_NAME_VAR (lhs))->uid);
+                  DECL_UID (SSA_NAME_VAR (lhs)));
 }
@@ -519,7 +519,7 @@ check_all_va_list_escapes (struct stdarg_info *si)
       FOR_EACH_SSA_TREE_OPERAND (use, stmt, iter, SSA_OP_ALL_USES)
         {
           if (! bitmap_bit_p (si->va_list_escape_vars,
-                              var_ann (SSA_NAME_VAR (use))->uid))
+                              DECL_UID (SSA_NAME_VAR (use))))
             continue;
 
           if (TREE_CODE (stmt) == MODIFY_EXPR)
@@ -565,12 +565,12 @@ check_all_va_list_escapes (struct stdarg_info *si)
             {
               if (TREE_CODE (lhs) == SSA_NAME
                   && bitmap_bit_p (si->va_list_escape_vars,
-                                   var_ann (SSA_NAME_VAR (lhs))->uid))
+                                   DECL_UID (SSA_NAME_VAR (lhs))))
                 continue;
 
               if (TREE_CODE (lhs) == VAR_DECL
                   && bitmap_bit_p (si->va_list_vars,
-                                   var_ann (lhs)->uid))
+                                   DECL_UID (lhs)))
                 continue;
             }
         }
@@ -690,7 +690,7 @@ execute_optimize_stdarg (void)
           break;
         }
 
-      bitmap_set_bit (si.va_list_vars, var_ann (ap)->uid);
+      bitmap_set_bit (si.va_list_vars, DECL_UID (ap));
 
       /* VA_START_BB and VA_START_AP will be only used if there is just
          one va_start in the function.  */
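Every hunk in this file is the same substitution, so the resulting idiom is worth stating once: sets of decls become bitmaps keyed by DECL_UID, and membership is a single bit test. A hedged sketch (the helper and its `vars` parameter are hypothetical):

    /* Sketch: decl-set membership via a DECL_UID-keyed bitmap.  */
    static bool
    decl_in_set_p (bitmap vars, tree var)
    {
      if (TREE_CODE (var) == SSA_NAME)
        var = SSA_NAME_VAR (var);
      return DECL_P (var) && bitmap_bit_p (vars, DECL_UID (var));
    }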

--- a/gcc/tree-tailcall.c
+++ b/gcc/tree-tailcall.c

@@ -132,16 +132,17 @@ static void find_tail_calls (basic_block, struct tailcall **);
 static bool
 suitable_for_tail_opt_p (void)
 {
-  int i;
+  referenced_var_iterator rvi;
+  tree var;
 
   if (current_function_stdarg)
     return false;
 
   /* No local variable nor structure field should be call-clobbered.  We
      ignore any kind of memory tag, as these are not real variables.  */
-  for (i = 0; i < (int) num_referenced_vars; i++)
+
+  FOR_EACH_REFERENCED_VAR (var, rvi)
     {
-      tree var = VEC_index (tree, referenced_vars, i);
-
+
       if (!(TREE_STATIC (var) || DECL_EXTERNAL (var))
           && (var_ann (var)->mem_tag_kind == NOT_A_TAG
--- a/include/ChangeLog
+++ b/include/ChangeLog

@@ -1,3 +1,8 @@
+2005-06-30  Daniel Berlin  <dberlin@dberlin.org>
+
+	* hashtab.h (HTAB_DELETED_ENTRY): New macro.
+	(HTAB_EMPTY_ENTRY): New macro.
+
 2005-06-20  Geoffrey Keating  <geoffk@apple.com>
 
 	* libiberty.h (strverscmp): Prototype.

--- a/include/hashtab.h
+++ b/include/hashtab.h

@@ -81,6 +81,15 @@ typedef void (*htab_free) (void *);
 typedef void *(*htab_alloc_with_arg) (void *, size_t, size_t);
 typedef void (*htab_free_with_arg) (void *, void *);
 
+/* This macro defines reserved value for empty table entry.  */
+
+#define HTAB_EMPTY_ENTRY    ((PTR) 0)
+
+/* This macro defines reserved value for table entry which contained
+   a deleted element.  */
+
+#define HTAB_DELETED_ENTRY  ((PTR) 1)
+
 /* Hash tables are of the following type.  The structure
    (implementation) of this type is not needed for using the hash
    tables.  All work with hash table should be executed only through
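Exporting the two sentinels is what allows code outside hashtab.c (such as the new GCC-side iterators) to walk a table's slots directly. A minimal sketch of such a walk, assuming only the public struct htab fields and htab_size:

    /* Sketch: visit every live element without htab_traverse.  */
    static void
    walk_live_entries (htab_t tab, void (*visit) (void *))
    {
      size_t i;

      for (i = 0; i < htab_size (tab); i++)
        {
          PTR e = tab->entries[i];
          if (e != HTAB_EMPTY_ENTRY && e != HTAB_DELETED_ENTRY)
            visit (e);
        }
    }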

--- a/libiberty/ChangeLog
+++ b/libiberty/ChangeLog

@@ -1,3 +1,8 @@
+2005-06-30  Daniel Berlin  <dberlin@dberlin.org>
+
+	* hashtab.c (EMPTY_ENTRY): Moved and renamed.
+	(DELETED_ENTRY): Ditto.
+
 2005-06-20  Geoffrey Keating  <geoffk@apple.com>
 
 	* strverscmp.c: New.

--- a/libiberty/hashtab.c
+++ b/libiberty/hashtab.c

@@ -64,15 +64,6 @@ Boston, MA 02110-1301, USA.  */
 #define CHAR_BIT 8
 #endif
 
-/* This macro defines reserved value for empty table entry.  */
-
-#define EMPTY_ENTRY    ((PTR) 0)
-
-/* This macro defines reserved value for table entry which contained
-   a deleted element.  */
-
-#define DELETED_ENTRY  ((PTR) 1)
-
 static unsigned int higher_prime_index (unsigned long);
 static hashval_t htab_mod_1 (hashval_t, hashval_t, hashval_t, int);
 static hashval_t htab_mod (hashval_t, htab_t);
@@ -290,7 +281,7 @@ htab_mod_m2 (hashval_t hash, htab_t htab)
 
 /* This function creates table with length slightly longer than given
    source length.  Created hash table is initiated as empty (all the
-   hash table entries are EMPTY_ENTRY).  The function returns the
+   hash table entries are HTAB_EMPTY_ENTRY).  The function returns the
    created hash table, or NULL if memory allocation fails.  */
 
 htab_t
@@ -401,7 +392,7 @@ htab_delete (htab_t htab)
 
   if (htab->del_f)
     for (i = size - 1; i >= 0; i--)
-      if (entries[i] != EMPTY_ENTRY && entries[i] != DELETED_ENTRY)
+      if (entries[i] != HTAB_EMPTY_ENTRY && entries[i] != HTAB_DELETED_ENTRY)
        (*htab->del_f) (entries[i]);
 
   if (htab->free_f != NULL)
@@ -427,7 +418,7 @@ htab_empty (htab_t htab)
 
   if (htab->del_f)
     for (i = size - 1; i >= 0; i--)
-      if (entries[i] != EMPTY_ENTRY && entries[i] != DELETED_ENTRY)
+      if (entries[i] != HTAB_EMPTY_ENTRY && entries[i] != HTAB_DELETED_ENTRY)
        (*htab->del_f) (entries[i]);
 
   memset (entries, 0, size * sizeof (PTR));
@@ -448,9 +439,9 @@ find_empty_slot_for_expand (htab_t htab, hashval_t hash)
   PTR *slot = htab->entries + index;
   hashval_t hash2;
 
-  if (*slot == EMPTY_ENTRY)
+  if (*slot == HTAB_EMPTY_ENTRY)
     return slot;
-  else if (*slot == DELETED_ENTRY)
+  else if (*slot == HTAB_DELETED_ENTRY)
     abort ();
 
   hash2 = htab_mod_m2 (hash, htab);
@@ -461,9 +452,9 @@ find_empty_slot_for_expand (htab_t htab, hashval_t hash)
        index -= size;
 
       slot = htab->entries + index;
-      if (*slot == EMPTY_ENTRY)
+      if (*slot == HTAB_EMPTY_ENTRY)
        return slot;
-      else if (*slot == DELETED_ENTRY)
+      else if (*slot == HTAB_DELETED_ENTRY)
        abort ();
     }
 }
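The probing above is double hashing: htab_mod picks the initial slot and htab_mod_m2 a stride that is never zero. Written as plain modular arithmetic (the real code replaces the divisions with multiplicative tricks), the n-th probe is:

    /* Sketch of the probe sequence; with the prime-derived sizes the
       table uses, the sequence eventually visits every slot.  */
    static size_t
    nth_probe (hashval_t hash, size_t size, unsigned int n)
    {
      size_t index = hash % size;
      size_t stride = 1 + hash % (size - 2);

      while (n--)
        {
          index += stride;
          if (index >= size)
            index -= size;
        }
      return index;
    }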
@@ -523,7 +514,7 @@ htab_expand (htab_t htab)
     {
       PTR x = *p;
 
-      if (x != EMPTY_ENTRY && x != DELETED_ENTRY)
+      if (x != HTAB_EMPTY_ENTRY && x != HTAB_DELETED_ENTRY)
        {
          PTR *q = find_empty_slot_for_expand (htab, (*htab->hash_f) (x));
@@ -556,8 +547,8 @@ htab_find_with_hash (htab_t htab, const PTR element, hashval_t hash)
 
   index = htab_mod (hash, htab);
   entry = htab->entries[index];
-  if (entry == EMPTY_ENTRY
-      || (entry != DELETED_ENTRY && (*htab->eq_f) (entry, element)))
+  if (entry == HTAB_EMPTY_ENTRY
+      || (entry != HTAB_DELETED_ENTRY && (*htab->eq_f) (entry, element)))
     return entry;
 
   hash2 = htab_mod_m2 (hash, htab);
@@ -569,8 +560,8 @@ htab_find_with_hash (htab_t htab, const PTR element, hashval_t hash)
        index -= size;
 
       entry = htab->entries[index];
-      if (entry == EMPTY_ENTRY
-          || (entry != DELETED_ENTRY && (*htab->eq_f) (entry, element)))
+      if (entry == HTAB_EMPTY_ENTRY
+          || (entry != HTAB_DELETED_ENTRY && (*htab->eq_f) (entry, element)))
        return entry;
     }
 }
@@ -615,9 +606,9 @@ htab_find_slot_with_hash (htab_t htab, const PTR element,
   first_deleted_slot = NULL;
 
   entry = htab->entries[index];
-  if (entry == EMPTY_ENTRY)
+  if (entry == HTAB_EMPTY_ENTRY)
     goto empty_entry;
-  else if (entry == DELETED_ENTRY)
+  else if (entry == HTAB_DELETED_ENTRY)
     first_deleted_slot = &htab->entries[index];
   else if ((*htab->eq_f) (entry, element))
     return &htab->entries[index];
@@ -631,9 +622,9 @@ htab_find_slot_with_hash (htab_t htab, const PTR element,
        index -= size;
 
       entry = htab->entries[index];
-      if (entry == EMPTY_ENTRY)
+      if (entry == HTAB_EMPTY_ENTRY)
        goto empty_entry;
-      else if (entry == DELETED_ENTRY)
+      else if (entry == HTAB_DELETED_ENTRY)
        {
          if (!first_deleted_slot)
            first_deleted_slot = &htab->entries[index];
@@ -649,7 +640,7 @@ htab_find_slot_with_hash (htab_t htab, const PTR element,
   if (first_deleted_slot)
     {
       htab->n_deleted--;
-      *first_deleted_slot = EMPTY_ENTRY;
+      *first_deleted_slot = HTAB_EMPTY_ENTRY;
       return first_deleted_slot;
     }
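The first_deleted_slot bookkeeping above is the reason the two sentinels must stay distinct: a probe can only stop at HTAB_EMPTY_ENTRY, since the sought element may lie beyond any number of HTAB_DELETED_ENTRY slots, yet an insertion should reuse the first deleted slot it passed rather than lengthen the probe chain.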
@@ -688,13 +679,13 @@ htab_remove_elt_with_hash (htab_t htab, PTR element, hashval_t hash)
   PTR *slot;
 
   slot = htab_find_slot_with_hash (htab, element, hash, NO_INSERT);
-  if (*slot == EMPTY_ENTRY)
+  if (*slot == HTAB_EMPTY_ENTRY)
     return;
 
   if (htab->del_f)
     (*htab->del_f) (*slot);
 
-  *slot = DELETED_ENTRY;
+  *slot = HTAB_DELETED_ENTRY;
   htab->n_deleted++;
 }
@@ -706,13 +697,13 @@ void
 htab_clear_slot (htab_t htab, PTR *slot)
 {
   if (slot < htab->entries || slot >= htab->entries + htab_size (htab)
-      || *slot == EMPTY_ENTRY || *slot == DELETED_ENTRY)
+      || *slot == HTAB_EMPTY_ENTRY || *slot == HTAB_DELETED_ENTRY)
     abort ();
 
   if (htab->del_f)
     (*htab->del_f) (*slot);
 
-  *slot = DELETED_ENTRY;
+  *slot = HTAB_DELETED_ENTRY;
   htab->n_deleted++;
 }
@@ -734,7 +725,7 @@ htab_traverse_noresize (htab_t htab, htab_trav callback, PTR info)
     {
       PTR x = *slot;
 
-      if (x != EMPTY_ENTRY && x != DELETED_ENTRY)
+      if (x != HTAB_EMPTY_ENTRY && x != HTAB_DELETED_ENTRY)
        if (!(*callback) (slot, info))
          break;
     }