It's also often highly effective to bring in consultants. A consultant has the authority of a certified Expert and has both more patience and more skill at the necessary meetings than most technical staff. The consultant also doesn't have the complicated existing relationships with people and can afford to cheerfully be a scapegoat for things people don't like forever after. In this situation, you want a consultant whose strengths are politics and authority, not necessarily the most technical person you can find.
One major computer manufacturer had a policy forbidding dial-in modems. Unfortunately, the company's centralized dial-in access didn't satisfy all of their programmers. Although the programmers couldn't request modem lines, some of them figured out that they could unplug the telephone lines from fax machines, connect them to modems, go home at night, and dial up their work computers. Even more unfortunately, a programmer in one of the groups with this habit was fired and proceeded to break into the site. He systematically tried all the phone numbers in the range the company had assigned to fax machines until he connected to one of the redirected ones and got a login prompt from an unsecured machine inside the corporate firewall. The former employee did significant damage before he was detected and shut out. He was able to gain a lot of time because the people trying to shut him out didn't know the modems existed. When they did figure out that modems were involved, the process of getting rid of them all proved to be tedious and prolonged, because lines were diverted only when people planned to use them.
That whole incident happened because management and system administrators had a policy that ignored some genuine needs of the people using the computer facility. The official policy required dial-in access to be so secure it was almost completely unusable, and the unofficial policy required dial-in access to be so usable that it was almost completely insecure. If a policy that allowed moderately insecure dial-in access had been in place, the break-in might have been avoided, and it certainly would have been easier to detect and stop. It would also have been avoided if the programmers had agreed that security was more important than dial-in access, but that kind of agreement is much harder to achieve than a compromise.
In fact, there wasn't much actual disagreement between the parties involved in this case. If the managers had been asked, they would have said that letting people work from home was important to them; they didn't understand that the existing dial-in system was not providing acceptable service. If the programmers had been asked, they would have said that preventing people from maliciously deleting their work was important to them; they didn't understand the risks of what they were doing. But nobody thought about security and usability at the same time, and the result was pure disaster.
Sometimes managers have a genuine willingness to accept risks that seem overwhelming to system administrators. For example, one computer manufacturer chose to put one of their large and powerful machines on an unprotected network and to give accounts on the machine to customers and prospective customers upon request. The system administrator thought it was a terrible idea and pointed out that the machine was fundamentally impossible to secure; there were a large number of accounts, changing rapidly, with no pattern, and they belonged to people the company couldn't control. Furthermore, the reason the company was giving out test accounts was that the machine was a fast parallel processor, which also meant that it might as well have been designed as the ultimate password-cracking machine. To the system administrator, it seemed extremely likely that once this machine was broken into (which was probably inevitable), it was going to be used as a tool to break into other machines.
A battle ensued, and eventually, a compromise was reached. The machine was made available, but extra security was employed to protect internal networks from it. (It was a compromise because it interfered with employees' ability to use the machine, which they needed to do to assist the outsiders who were using it.) Management chose to accept the remaining risk that the machine would be used as a platform to attack other sites, knowing that there was a potential for extremely bad publicity as a result.
What happened? Sure enough, the machine was compromised and was used to attack at least the internal networks. The attacks on the internal networks were extremely annoying and cost the company money in system administrators' time, but the attacks didn't produce significant damage, and there was little or no bad publicity. Management considered this expense to be acceptable, however, given the sales generated by letting people test-drive the machine. In this case, conflicting security policies were resolved explicitly -- by discussion and compromise -- and the result was a policy that seemed less strong than the original, but that provided sufficient protection. By openly and intentionally choosing to accept a risk, the company brought it within acceptable limits.
Talk to each of these people in terms they care about. This requires a lot of listening, and probably some research, before you ever start talking. To managers, talk about things like probable costs and potential losses; to executives, talk about risk versus benefit; and to technical staff, talk about capabilities. Before you present a proposal, be prepared with an explanation that suits your audience's point of view and technical level. If you have trouble understanding or communicating with a particular group, you may find it helps to build a relationship with someone who understands that group and can translate for you.
Be prepared to think about other people's issues in other people's terms, which means that you're going to give different explanations to different people. You're not trying to deceive anybody; the basic information is the same no matter who you're talking to, but different details matter to different audiences. If a particular decision saves money and makes for a more enjoyable working environment, you don't go to the chief financial officer and say "We want to do it this way because it's more fun", and then go to the programmers and say "We want to do it this way because it's cheaper". You tell the chief financial officer about the money and the programmers about the fun.
If you are a technical person, you may initially despair at the idea that you need to discuss security in terms of money. In particular, you may feel that you can't possibly come up with the "right" answer. You don't need to come up with the right answer. Nobody can actually say how much a given security policy costs -- hardware and software costs are usually easy, but then you have the time to set it up, the management meetings to argue about it, the maintenance, the extra five minutes a day for every programmer to log in, the changes to other systems. Saying how much money it saves is even worse; generally, the worst-case possibility is utter disaster, costing any amount of money your imagination can dream up or your organization can dredge up from the bottom of its pockets. However, that's so implausible you can't use it, so you have to guess how much more mundane incidents will cost you, and in what ways. Will people sue you? Will you lose customers? Will you lose control of a valuable asset? This process is not going to come up with answers that will make a technical person happy. That's OK. Come up with a method of estimating that you find plausible and that gives the results you want, attach equally plausible numbers to it, chart them, and present them. You can be perfectly honest about the fact that they're imprecise; the important thing is that you have numbers, and that you believe the justification for those numbers, no matter how accurate (or inaccurate) you think the result is. In general, you're not expected to produce absolute truth.
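If it helps to see the shape of such an estimate, here is a minimal sketch in Python. It is not a recipe; every rate, count, and dollar figure in it is a hypothetical placeholder, and the function names are made up for illustration. The point is only that a rough, explainable calculation beats no numbers at all.

```python
# Back-of-the-envelope comparison of what a security measure costs per year
# versus the losses it is expected to prevent. All figures are hypothetical;
# substitute numbers you can defend for your own organization.

HOURLY_RATE = 75           # assumed loaded cost of an hour of staff time
WORKDAYS_PER_YEAR = 230    # assumed working days per year

def annual_cost(hardware, setup_hours, maintenance_hours_per_month,
                minutes_per_user_per_day, users):
    """Yearly cost of a measure: purchases, setup time, upkeep, and the
    small daily overhead it imposes on every user."""
    setup = setup_hours * HOURLY_RATE
    maintenance = maintenance_hours_per_month * 12 * HOURLY_RATE
    user_overhead = (minutes_per_user_per_day / 60.0) * HOURLY_RATE * \
                    WORKDAYS_PER_YEAR * users
    return hardware + setup + maintenance + user_overhead

def expected_annual_loss(incidents_per_year, cost_per_incident):
    """Expected loss without the measure: guessed incident frequency
    times guessed cleanup cost per incident."""
    return incidents_per_year * cost_per_incident

cost = annual_cost(hardware=5000, setup_hours=40,
                   maintenance_hours_per_month=4,
                   minutes_per_user_per_day=5, users=50)
loss = expected_annual_loss(incidents_per_year=2, cost_per_incident=50000)

print(f"Estimated yearly cost of the measure:    ${cost:,.0f}")
print(f"Estimated yearly loss it should prevent: ${loss:,.0f}")
```

Nobody expects the inputs to be exact, only explainable; present the two totals side by side and be open about how you arrived at them.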
In particular, people need to know about the consequences of their decisions, including best, worst, and probable outcomes. Consequences that are obvious to you may not be obvious to other people. For example, people who are not knowledgeable about Unix may be quite willing to give out root passwords. They don't realize what the implications are, and they may be very upset when they find out.
People who have been surprised often overreact. They may go from completely unconcerned to demanding the impossible. One good break-in, or even a prank, can convert people from not understanding all the fuss about passwords to inquiring about the availability of voiceprint identification and machine gun turrets. (It's preferable to get them to make decisions while they are mildly worried, instead of blindly panicked!)
Don't offer people decisions unless they have both the authority and the information with which to make those decisions. You don't want somebody to get attached to a decision, only to have it overruled from higher up (or worse yet, from somebody at their level but with the appropriate span of control). Always make it clear why they're being asked to decide (instead of having the decision made somewhere else).
In most cases, you want to avoid open-ended questions. It's better to ask "Should we invest money in a single defensive system, or should we try to protect each machine individually?" than "What do you think we should do about Internet security?" (The open question gives the person answering the option of saying "nothing", which is probably not an answer you're going to be happy with.) In most cases, it's better yet to say "Should we spend about $5,000 on a single defensive system, or $15,000 on protecting each machine individually?"
When you explain policies or procedures, explain them in terms of the original decisions. Show people the reasoning process. If you find that you can't do so, either the original decisions didn't cover some issues that are important to you (maybe so important you didn't think they needed to be mentioned), or the policies and procedures are unfounded and possibly unreasonable.
Another common example of misdirected concern involves managers worrying that employees will distribute confidential information over the Internet. Again, this usually isn't a technical problem; it's a management problem. The same employee who could email your source code to a competitor could also carry it out the door in his pocket on a Zip disk (generally far more conveniently and with less chance of being caught). It is irrational to place technological restrictions on information that can be sent out by email unless you also check everybody's bags and pockets as they leave the premises.