This work focuses on the communication perspective of decentralized learning over wireless networks, using consensus-based decentralized stochastic gradient descent (D-SGD). Considering the actual communication cost or delay caused by in-network information exchange in every iteration, our goal is to achieve fast convergence of the algorithm, measured by improvement per transmission slot. We propose BASS, an efficient communication framework for D-SGD over wireless networks with broadcast-based subgraph sampling. More explicitly, in every iteration, we activate multiple subsets of non-interfering nodes to broadcast model updates to their neighbors. These subsets are activated randomly over time with certain probabilities, subject to a given communication cost (e.g., the number of transmission slots per iteration). During the consensus update step, only bidirectional links are effectively used, to preserve communication symmetry; a sketch of one such iteration is given below. Compared with existing link-based scheduling methods, the broadcast nature of wireless channels offers an inherent advantage in speeding up the convergence of decentralized learning, since more links can be activated under the same number of transmission slots.
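The following is a minimal sketch, not the authors' reference implementation, of one BASS-style iteration under stated assumptions: subsets of mutually non-interfering nodes are activated at random to broadcast, a link is kept only if both endpoints broadcast (so it is bidirectional), and the consensus step mixes over the resulting symmetric subgraph. The helper names (`sample` logic, `metropolis_weights`, `bass_iteration`) and the use of Metropolis-Hastings mixing weights are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis_weights(adj):
    """Symmetric, doubly-stochastic mixing matrix via Metropolis-Hastings
    weights on the given (effective) adjacency matrix."""
    n = adj.shape[0]
    deg = adj.sum(axis=1)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if adj[i, j]:
                W[i, j] = 1.0 / (1 + max(deg[i], deg[j]))
        W[i, i] = 1.0 - W[i].sum()  # self-weight completes the row sum to 1
    return W

def bass_iteration(x, grads, adj, subsets, probs, lr=0.1):
    """One D-SGD iteration with broadcast-based subgraph sampling (sketch).

    x:       (n, d) array of local models
    grads:   (n, d) array of local stochastic gradients
    adj:     (n, n) 0/1 adjacency matrix of the full topology (no self-loops)
    subsets: lists of node indices; nodes within a subset are assumed able
             to broadcast simultaneously without interference
    probs:   activation probability of each subset in this iteration
    """
    # Randomly activate subsets; every node in an active subset broadcasts
    # its model update to all of its neighbors.
    broadcasters = set()
    for s, p in zip(subsets, probs):
        if rng.random() < p:
            broadcasters.update(s)
    # A link (i, j) is effective only if BOTH endpoints broadcast this
    # iteration, which keeps the communication (and mixing) symmetric.
    active = np.zeros_like(adj)
    for i in broadcasters:
        for j in broadcasters:
            if adj[i, j]:
                active[i, j] = 1
    W = metropolis_weights(active)  # mix only over the effective subgraph
    return W @ x - lr * grads       # consensus step + local SGD step

# Example: a 4-node ring with two non-interfering subsets {0, 2} and {1, 3}.
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]])
x = rng.normal(size=(4, 2))
grads = rng.normal(size=(4, 2))
x = bass_iteration(x, grads, adj, subsets=[[0, 2], [1, 3]], probs=[0.8, 0.8])
```

In this toy example, activating the subset {0, 2} costs one transmission slot but creates broadcast transmissions over up to four links at once, which is the source of the advantage over scheduling individual links one slot at a time.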